Passing variables to Invoke-Command, inside inlineScript, inside workflow - powershell

I'm trying to create a script that connects to various servers, attaches a PSDrive, and copies over files. The problem is that I am unable to pass the variable into the Invoke-Command script block.
workflow kopijobb {
    param ([string[]]$serverList, $creds, $basePath)

    foreach -parallel ($server in $serverList) {
        # Use the sequence keyword to ensure everything inside of it runs in order on each computer.
        sequence {
            # Use the inlineScript keyword to allow PowerShell workflow to run regular PowerShell cmdlets.
            inlineScript {
                $path = $using:basePath
                Write-Host "Starting $using:server using $path"

                # Create a session with New-PSSession
                $session = New-PSSession -ComputerName $using:server -Credential $using:creds

                # Copy Java and recreate symlink
                Invoke-Command -Session $session -ScriptBlock {
                    # Make a PSDrive, since directly copying from a UNC path doesn't work due to credential issues
                    New-PSDrive -Name N -PSProvider FileSystem -Root $using:path -Credential $using:creds | Out-Null
                    # ... (rest of the copy logic omitted)
                }
            }
        }
    }
}
I pass the network path to $basePath, and I'm able to read it inside the inlineScript block (where I have tried storing it in a new variable to test). But once I try accessing it in the New-PSDrive command, the variable is suddenly empty/unreachable, and the mounting of the drive fails with the error "Cannot bind argument to parameter 'Root' because it is null."
I'm at a loss to why this fails, so I'm turning to the collective wisdom here instead.

It feels embarrassing to answer my own question, especially on the same day, but I bumped into a PowerShell guru at work and he took one glance at the script and saw the problem:
I had to add -Args (an alias for -ArgumentList) to the Invoke-Command:
Invoke-Command -Session $session -ScriptBlock {
    param($srv, $login, $path, $...)
    # Make a PSDrive, since directly copying from a UNC path doesn't work due to credential issues
    New-PSDrive -Name N -PSProvider FileSystem -Root $path -Credential $login | Out-Null
} -Args $using:server, $using:creds, $using:basePath, $using:...
This does of course mean that I had to import all the needed arguments from the top level into the workflow, and then into the Invoke-Command.
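For reference, here is a minimal sketch of how the whole chain fits together (a simplified illustration, not the exact production script): workflow parameters cross into inlineScript via $using:, and from there into the nested Invoke-Command script block via param() / -ArgumentList.

workflow kopijobb {
    param ([string[]]$serverList, $creds, $basePath)

    foreach -parallel ($server in $serverList) {
        inlineScript {
            # $using: works across the workflow -> inlineScript boundary...
            $session = New-PSSession -ComputerName $using:server -Credential $using:creds

            # ...but the nested Invoke-Command script block only sees what is
            # passed explicitly via -ArgumentList and bound to param().
            Invoke-Command -Session $session -ScriptBlock {
                param($path, $login)
                New-PSDrive -Name N -PSProvider FileSystem -Root $path -Credential $login | Out-Null
                # copy work goes here
            } -ArgumentList $using:basePath, $using:creds

            Remove-PSSession $session
        }
    }
}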

Related

Issue getting pssessions to loop properly for multiple servers listed in a txt file

I start with a txt file named vms.txt.
It contains 2 servers like so:
server1
server2
When I run the script shown below, the command that is invoked to install VMware tools only runs on server2 and server1 gets skipped. Does anyone have any suggestions on how to modify this script to make it run on both servers in the txt file? I will have to run this in the future for hundreds of VMs, so I am trying to find an easy way to get this to loop properly.
$cred = Get-Credential
$vms = Get-Content C:\Scripts\Tools\vms.txt
foreach ($vm in $vms) {
    $sessions = New-PSSession -ComputerName $vm -Credential $cred
}
foreach ($session in $sessions)
{
    Invoke-Command -Session $session -ScriptBlock {
        c:\users\jsmith\documents\VMware-tools-10.3.5.exe /v "/qn REBOOT=R Remove=AppDefense,VMCI"
    }
}
In your loop-based approach, the problem is your variable assignment:
# !! This only ever stores the *last* session created in $sessions,
# !! because the assignment is performed in *each iteration*.
foreach ($vm in $vms) {
    $sessions = New-PSSession -ComputerName $vm -Credential $cred
}
The immediate fix is to move the assignment out of the loop:
# OK - captures *all* session objects created in $sessions
$sessions = foreach ($vm in $vms) {
    New-PSSession -ComputerName $vm -Credential $cred
}
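An alternative, not from the answer above but shown purely for comparison, is to build the array explicitly; it works, though it recreates the array on every iteration:

$sessions = @()
foreach ($vm in $vms) {
    # += creates a new array each time; fine for a handful of servers.
    $sessions += New-PSSession -ComputerName $vm -Credential $cred
}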
Taking a step back:
Both New-PSSession -ComputerName and Invoke-Command -Session accept an array of computer names / sessions, so there's no need for loops.
Passing multiple sessions / computer names to Invoke-Command has the big advantage that the operations run in parallel.
Note:
Invoke-Command has built-in throttling to avoid targeting too many machines at once. It defaults to 32, but can be modified with the -ThrottleLimit parameter.
Output from the targeted computers will arrive in no predictable order, but the output objects are decorated with (among others) a .PSComputerName property reflecting the originating computer - see the bottom section of this answer.
That is, your code can be simplified to:
$cred = Get-Credential
$vms = Get-Content C:\Scripts\Tools\vms.txt
$sessions = New-PSSession -ComputerName $vms -Credential $cred
Invoke-Command -Session $sessions -ScriptBlock {
    c:\users\jsmith\documents\VMware-tools-10.3.5.exe /v "/qn REBOOT=R Remove=AppDefense,VMCI"
}
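To see which computer each result came from via the .PSComputerName decoration mentioned above, something like the following sketch works (Get-Service is just a stand-in command):

$results = Invoke-Command -Session $sessions -ScriptBlock { Get-Service WinRM }
# Each output object carries a .PSComputerName property identifying its origin.
$results | Select-Object PSComputerName, Name, Status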
Important:
Sessions should eventually be cleaned up with Remove-PSSession when no longer needed.
However, that stops any commands running in those sessions, so if you've launched asynchronous operations via your Invoke-Command call, you need to ensure that those operations have finished first - see the comments re potentially asynchronous execution of your VMware-tools-10.3.5.exe application below.
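For example, once you are sure all remote operations have completed (a short sketch using the $sessions variable from above):

# Tear down the remote sessions; any commands still running in them would be stopped.
Remove-PSSession -Session $sessions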
Or, even simpler: if you only need to execute one command on each machine, there is no need to create sessions explicitly; pass all computer names directly to Invoke-Command's -ComputerName parameter:
$cred = Get-Credential
$vms = Get-Content C:\Scripts\Tools\vms.txt
# Note the use of -ComputerName
Invoke-Command -ComputerName $vms -Credential $cred -ScriptBlock {
    # Note the use of | Write-Output to ensure synchronous execution.
    c:\users\jsmith\documents\VMware-tools-10.3.5.exe /v "/qn REBOOT=R Remove=AppDefense,VMCI" | Write-Output
}
Important:
If your application (VMware-tools-10.3.5.exe) runs asynchronously, you must ensure its
synchronous execution, otherwise it may not run to completion, because the implicitly created remote session is discarded when a script block returns from a given computer.
A simple trick to ensure synchronous execution of any external (GUI-subsystem) executable is to pipe it to Write-Output, as shown above (or Wait-Process, if it doesn't produce console output) - see this answer for an explanation.
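Another option, not used in the answer above but worth a hedged mention, is Start-Process -Wait, which blocks until the launched process exits regardless of its subsystem:

Invoke-Command -ComputerName $vms -Credential $cred -ScriptBlock {
    # -Wait makes PowerShell block until the installer process has exited.
    Start-Process -FilePath 'c:\users\jsmith\documents\VMware-tools-10.3.5.exe' -ArgumentList '/v "/qn REBOOT=R Remove=AppDefense,VMCI"' -Wait
}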

PowerShell failing to copy from UNC path

I'm working on a script to copy a folder from a UNC path to a local server. I'm remotely running my script through an interactive session and utilizing Invoke-Command -ScriptBlock like so:
Invoke-Command -ComputerName MyServer -ScriptBlock $Script
This is the script to do the copying:
$Script = {
    try {
        New-PSDrive -Name MyDrive -PSProvider FileSystem -Root \\uncpathserver\e$\SourceCode\ -Credential Contoso\me
        Copy-Item -Path \\uncpathserver\e$\SourceCode\* -Destination E:\Inetpub\Target -Recurse -Force
    }
    catch {
        Write-Host "Failed to copy!"
    }
}
It is failing and throwing my catch block every time. I can't seem to figure out what I am missing to get this to work - it seems so simple and I hope I'm not missing something blatantly obvious.
EDIT:
I was able to get it to work by running the script from my local PC instead of from a server. I'm also calling the file copy outside of the $Script block now. This is what the new code looks like:
$MyServers = @("server-01", "server-02")
foreach ($server in $MyServers)
{
    $TargetSession = New-PSSession -ComputerName $server -Credential contoso\me
    Copy-Item -ToSession $TargetSession -Path C:\Source\TheCode\ -Destination "E:\InetPub\wherethecodegoes" -Recurse -Force
}
Everything else I'm doing inside my $Script block (which has been omitted here for troubleshooting's sake) is working A-OK. I do have to enter my credentials for each server, but given the small number of servers I'm working with, that isn't a deal breaker.
Sounds like a 'Kerberos double hop' problem.
Short-Answer
Avoid the problem. From your system, set up two PSDrives, then copy \\uncpathserver\e$\SourceCode\ to \\RemoteIISserver\E$\Inetpub\Target\.
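A sketch of that workaround, run from your own machine so only single hops are involved (the drive names and credential variable are placeholders):

$cred = Get-Credential Contoso\me

# Map both ends as PSDrives with explicit credentials...
New-PSDrive -Name Src -PSProvider FileSystem -Root \\uncpathserver\e$\SourceCode -Credential $cred | Out-Null
New-PSDrive -Name Dst -PSProvider FileSystem -Root \\RemoteIISserver\E$\Inetpub\Target -Credential $cred | Out-Null

# ...then copy directly between them.
Copy-Item -Path Src:\* -Destination Dst:\ -Recurse -Force

Remove-PSDrive Src, Dst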
Long-Answer
From your system (System A), you are remotely executing a script (on System B) that will copy a remote folder (from System C).
It should work, but it doesn't. This is because when you (specifically, your account) connect from System A to System B remotely, and System B then asks System C for something on your behalf, System C doesn't trust you.
A quick Google of the problem will show a myriad of ways around this issue. However:
Not all methods are secure (example: CredSSP)
Not all methods will work on your version of Windows (which is...?)
Not all methods will work with PowerShell
One secure method that does work with PowerShell leverages delegation.
This can be a bit daunting to setup, and I suggest you read-up on this thoroughly.
## Module 'ActiveDirectory' from the RSAT-AD-PowerShell Windows feature is required.
$ServerA = $env:COMPUTERNAME
$ServerB = Get-ADComputer -Identity ServerB
$ServerC = Get-ADComputer -Identity ServerC
Delegate 'Server B' to access 'Server C':
# Set the resource-based Kerberos constrained delegation
Set-ADComputer -Identity $ServerC -PrincipalsAllowedToDelegateToAccount $ServerB
# Confirm AllowedToActOnBehalfOfOtherIdentity.Access is correct (indirectly).
Get-ADComputer -Identity $ServerC -Properties PrincipalsAllowedToDelegateToAccount
Wait about 15 minutes for 'Server B' to sync up (or just reboot it).
You can force this with the following (note: $cred should contain your credentials):
Invoke-Command -ComputerName $ServerB.Name -Credential $cred -ScriptBlock {
klist purge -li 0x3e7
}
Run a test hop:
Invoke-Command -ComputerName $ServerB.Name -Credential $cred -ScriptBlock {
Test-Path \\$($using:ServerC.Name)\C$
Get-Process lsass -ComputerName $($using:ServerC.Name)
Get-EventLog -LogName System -Newest 3 -ComputerName $($using:ServerC.Name)
}
The downside is that you have to set up every remote target (every 'Server C') this way. But the upside is that it's secure.
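Should you later want to undo the configuration for a given 'Server C', clearing the property removes the delegation again (a short sketch based on standard Set-ADComputer behaviour):

# Clearing PrincipalsAllowedToDelegateToAccount removes the resource-based constrained delegation.
Set-ADComputer -Identity $ServerC -PrincipalsAllowedToDelegateToAccount $null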

PowerShell Invoke-Command on an advanced function

I have an advanced function Copy-FilesHC that is available in a module file. This function copies some files from the Source to the Destination folder and generates some output for a log file.
The function works fine locally:
Copy-FilesHC -Source $Src -Destination $Des *>> $Log
It also works on a remote machine:
# For remote use we need to make it available first
Import-Module (Get-Command Copy-FilesHC).ModuleName
Invoke-Command -Credential $Cred -ComputerName $Host -ScriptBlock ${Function:Copy-FilesHC} -ArgumentList $LocalSrc, $LocalDes
However, I can't seem to figure out how I can have it pass the output to a log file like in the first command. When I try the following, it fails:
Invoke-Command -Credential $Cred -ComputerName $Host -ScriptBlock ${Function:Copy-FilesHC *>> $Log} -ArgumentList $LocalSrc, $LocalDes
Invoke-Command : Cannot validate argument on parameter 'ScriptBlock'. The argument is null. Provide a valid value for the argument, and then try running the command again.
As indicated here, I thought the $ sign for the ScriptBlock was incorrect. But this way I don't need to put my advanced function in a ScriptBlock to copy it over, as that now happens automatically even though the function is only available within the module. So I just need to find out how to capture the output in the log file.
Thank you for your help.
Found the solution just a few minutes ago:
# For remote use we need to make it available first
Import-Module (Get-Command Copy-FilesHC).ModuleName
Invoke-Command -Credential $Cred -ComputerName $Host -ScriptBlock ${Function:Copy-FilesHC} -ArgumentList $LocalSrc, $LocalDes *>> $Log

PowerShell remote application (.cmd) deployment

I am a beginner with PowerShell and have been struggling to piece this together with help from different sites. My requirement and scenario are as follows:
I have a Windows Server 2008 machine (rktdepy) with PowerShell installed, and a packaged application with a .cmd file. When I run this .cmd file, the application is deployed.
The server name is rktdepy, and I want to create a PowerShell script which will connect to other servers in the network (the server names should be picked up from a txt file) and install the application, accessing the files remotely from the rktdepy server. The files are not supposed to be copied to any server, and PsExec should not be used, for security reasons.
So far I have used Invoke-Command and mapped a network drive, but I still have issues:
$Comsession = Get-content c:\adminfiles\scripts\deploy.txt | new-pssession -throttlelimit 50
Invoke-command -computername RKTDEPLY54 -scriptblock { (new-object -comobject wscript.network).mapnetworkdrive("R:", "\\rktdepy\deploy", $true) }
Invoke-command -session $comsession -scriptblock {"CMD /C r:\QR_DEPLOY.CMD"}
The above script throws an error.
I don't want to use any password in the script; it should use the credentials of the account currently logged in on the rktdepy server. It is OK if the script prompts for a user name and password that has admin access to all servers.
It looks like you are dealing with a couple of problems. One is that the session where you map the drive is gone by the time you run the next Invoke-Command that uses the mapped drive; you could fix that by moving the drive mapping into the same script block. The second one is a "second hop" issue. See a resource like Don Jones' free ebook Secrets of PowerShell Remoting on http://powershell.org/wp/books.
Steve
I have tested the following on my machine and it is working so far. There is also another method you can try, listed below.
Method 1:
1. I have a txt file with a list of computers, named allcomputers.txt. It contains one machine name per line:
Machine10
Machine20
Machine30
Machine40
2. The deployment script (mydeploytest.ps1), which accepts ComputerName, Username and Password as input, creates a new PSSession and then invokes the command:
param(
    [string]$ComputerName,
    [string]$User,
    [string]$pass
)

Get-PSSession | Remove-PSSession
$session = New-PSSession -ComputerName $ComputerName

Invoke-Command -Session $session -ScriptBlock {
    param(
        [string]$ComputerName,
        [string]$Username,
        [string]$Password
    )
    $net = New-Object -ComObject WScript.Network
    $net.MapNetworkDrive("U:", "\\RKTDEPY\deploy", $false, $Username, $Password)
    Invoke-Expression "CMD /C U:\deploy.cmd"
    $net.RemoveNetworkDrive("U:")
} -Args $ComputerName, $User, $pass

Get-PSSession | Remove-PSSession
A PowerShell command-line one-liner to accomplish the deployment task:
PS C:\> Get-Content C:\scripts\allcomputers.txt | Foreach { C:\scripts\mydeploytest.ps1 $_ "yourserviceaccount" "password" }
Method 2:
The help for Invoke-Command has an example of how to solve the double-hop issue stevals mentions in his answer.
PS C:\> Enable-WSManCredSSP -Delegate Server02
PS C:\> Connect-WSMan Server02
PS C:\> Set-Item WSMan:\Server02*\Service\Auth\CredSSP -Value $true
PS C:\> $s = New-PSSession Server02
PS C:\> Invoke-Command -Session $s -ScriptBlock {Get-Item \\Net03\Scripts\LogFiles.ps1} -Authentication CredSSP -Credential Domain01\Admin01
I think that with a little modification to Method 2 you can achieve what you want.
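A rough sketch of that modification (untested here, and assuming CredSSP has been enabled on the client and the target servers as shown above; the share path comes from the original question):

$cred = Get-Credential   # admin account for the target servers
Get-Content C:\scripts\allcomputers.txt | ForEach-Object {
    $s = New-PSSession -ComputerName $_ -Credential $cred -Authentication Credssp
    # With CredSSP, the delegated credentials let the remote session reach \\rktdepy\deploy directly.
    Invoke-Command -Session $s -ScriptBlock { & '\\rktdepy\deploy\QR_DEPLOY.CMD' }
    Remove-PSSession $s
}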

Enter-PSSession is not working in my PowerShell script

When I run the lines below from a script the file ends up being created on my local machine.
$cred = Get-Credential domain\DanTest
Enter-PSSession -computerName xsappb01 -credential $cred
New-Item -type file c:\temp\blahxsappk02.txt
exit-pssession
When I run each line individually from the PowerShell console, the remote session is created correctly and the file is created on the remote machine. Any thoughts on why? Is it perhaps a timing issue in the script?
Not sure if it is a timing issue. I suspect it's more that Enter-PSSession invokes something like a nested prompt and your subsequent commands are not executing within it. In any case, I believe Enter/Exit-PSSession is meant for interactive use, not scripting use. For scripts, use New-PSSession and pass that session instance to Invoke-Command, e.g.:
$cred = Get-Credential domain\DanTest
$s = New-PSSession -computerName xsappb01 -credential $cred
Invoke-Command -Session $s -Scriptblock {New-Item -type file c:\temp\blah.txt}
Remove-PSSession $s
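As a follow-up tying back to the rest of this thread: if the remote command needs a value from the local script, it can be passed with $using: (PowerShell 3.0+) or -ArgumentList. A small sketch; the $fileName variable is a hypothetical example:

$cred = Get-Credential domain\DanTest
$fileName = 'blahxsappk02.txt'   # hypothetical local variable
$s = New-PSSession -ComputerName xsappb01 -Credential $cred
# $using: carries the local value into the remote script block.
Invoke-Command -Session $s -ScriptBlock { New-Item -ItemType File -Path (Join-Path 'c:\temp' $using:fileName) }
Remove-PSSession $s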