PowerShell Invoke-Command severe performance issues

I'm having a heck of a time running a script on a remote system efficiently. When run locally the command takes 20 seconds. When run using Invoke-Command the command takes 10 or 15 minutes - even when the "remote" computer is my local machine.
Can someone explain to me the difference between these two commands and why Invoke-Command takes SO much longer?
Run locally on MACHINE:
get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue
Run remotely against \\MACHINE (behaves the same whether MACHINE is my local machine or a remote machine):
invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue}
Note: The command returns 5 file objects
EDIT: I think part of the problem may be reparse points. When run locally, get-childitem (and DIR /a-l) do not follow junction points. When run remotely, they do, even if I use the -Attributes !ReparsePoint switch.
EDIT2: Yet, if I run the command invoke-command -ComputerName MACHINE -ScriptBlock {get-childitem C:\ -Attributes !ReparsePoint -Force -ErrorAction SilentlyContinue} I don't see the junction points (i.e. Documents and Settings). So it is clear that both DIR /a-l and get-childitem -Attributes !ReparsePoint fail to prevent recursion into a reparse point; they appear to only filter out the entry itself.
Thanks a bunch!

It appears the issue is the reparse points. For some reason, access to reparse points (like Documents and Settings) is denied when the command is run locally. When the command is run remotely, both DIR and Get-ChildItem will recurse into reparse points.
Using the -Attributes !ReparsePoint switch for get-childitem and the /a-l switch for DIR does not prevent this. Instead, it appears those switches only prevent the reparse point from appearing in the file listing output; they do not prevent the command from recursing into those folders.
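One way to see this for yourself (a sketch; timings will vary by machine) is to time the filtered and unfiltered runs. If the attribute filter actually stopped recursion, the filtered run would be dramatically faster, but it is not:
Measure-Command { Get-ChildItem C:\ -Recurse -Force -ErrorAction SilentlyContinue } | Select TotalSeconds
Measure-Command { Get-ChildItem C:\ -Recurse -Force -Attributes !ReparsePoint -ErrorAction SilentlyContinue } | Select TotalSeconds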
Instead I had to write a recursive script and do the directory recursion myself. It's a little bit slower on my machine. Instead of around 20 seconds locally, it took about 1 minute. Remotely it took closer to 2 minutes.
Here is the code I used:
EDIT: With all the problems with PowerShell 2.0, PowerShell remoting, and memory usage of the original code, I had to update my code as shown below.
function RecurseFolder($path) {
    $files = @()
    # Enumerate subdirectories, skipping reparse points so we never recurse into them
    $directory = @(get-childitem $path -Force -ErrorAction SilentlyContinue | Select FullName,Attributes | Where-Object {$_.Attributes -like "*directory*" -and $_.Attributes -notlike "*reparsepoint*"})
    foreach ($folder in $directory) { $files += @(RecurseFolder($folder.FullName)) }
    # Collect matching files in the current directory
    $files += @(get-childitem $path -Filter "*.pst" -Force -ErrorAction SilentlyContinue | Where-Object {$_.Attributes -notlike "*directory*" -and $_.Attributes -notlike "*reparsepoint*"})
    $files
}
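To run the same function against a remote machine, its definition has to travel with the script block. A minimal sketch (MACHINE is a placeholder; the function above must already be defined in the local session):
$funcDef = ${function:RecurseFolder}.ToString()
Invoke-Command -ComputerName MACHINE -ScriptBlock {
    param($funcText)
    # Recreate the function inside the remote session, then call it
    Set-Item -Path function:RecurseFolder -Value ([ScriptBlock]::Create($funcText))
    RecurseFolder 'C:\'
} -ArgumentList $funcDef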

If it's a large directory structure and you just need the full path names of the files, you should be able to speed that up considerably by using the legacy dir command instead of Get-ChildItem:
invoke-command -ComputerName MACHINE -ScriptBlock {cmd /c dir c:\*.pst /s /b /a-d /a-l}

Try using a remote session:
$yourLoginName = 'loginname'
$server = 'targetserver'
$t = New-PSSession $server -Authentication CredSSP -Credential (Get-Credential $yourLoginName)
cls
"$(get-date) - Start"
$r = Invoke-Command -Session $t -ScriptBlock{[System.IO.Directory]::EnumerateFiles('c:\','*.pst','AllDirectories')}
"$(get-date) - Finish"

I faced the same problem.
PowerShell clearly has problems transferring arrays through PSRemoting.
This little experiment demonstrates it (UPDATED):
$session = New-PSSession localhost
$arrayLen = 1024000
Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Preparing test array {0} elements length..." -f $using:arrayLen)
        $global:result = [Byte[]]::new($using:arrayLen)
        [System.Random]::new().NextBytes($result)
    }
} | % {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}
Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Transfer array ({0})" -f $using:arrayLen)
        return $result
    } | Out-Null
} | % {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}
Measure-Command {
    Invoke-Command -Session $session -ScriptBlock {
        Write-Host ("Transfer same array nested in a single object")
        return @{array = $result}
    }
} | % {Write-Host ("Completed in {0} sec`n" -f $_.TotalSeconds)}
And my output (times in seconds):
Preparing test array 1024000 elements length...
Completed in 0.0211385 sec
Transfer array (1024000)
Completed in 48.0192142 sec
Transfer same array nested in a single object
Completed in 0.0990711 sec
As you can see, transferring the array took almost a minute, while the same array nested inside a single object came back in a fraction of a second, despite its size.
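Applied to the original problem, a possible workaround (a sketch under the same session setup; the property name "files" is arbitrary) is to wrap the result array in a single object on the remote side and unwrap it locally:
$wrapped = Invoke-Command -Session $session -ScriptBlock {
    # Wrap the large result array in a hashtable so it crosses the wire as one object
    @{files = Get-ChildItem C:\ -Filter *.pst -Recurse -Force -ErrorAction SilentlyContinue | Select-Object -ExpandProperty FullName}
}
$files = $wrapped.files  # unwrap locally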

Related

Copy files to remote computer

As part of a project requirement, I am preparing a script to copy files from the local computer to remote servers (with username and password).
I have tried the two approaches below with files of 27 KB and 50 MB.
i. Using ReadAllBytes and WriteAllBytes
This works for the small 27 KB file, whereas the 50 MB file drives the CPU to 100% and takes far too long.
$myfile = [System.IO.File]::ReadAllBytes("C:\Temp\test\a.txt")
$Stat = $null
$session=$null
$session = New-PSSession -computerName $server -credential $user
$Stat = Invoke-Command -Session $session -ArgumentList $myfile -Scriptblock {[System.IO.File]::WriteAllBytes("C:\temp\a.txt", $args)} -ErrorAction Stop
ii. I tried to copy with Copy-Item, but the issue is that the target directory is not mounted as a drive:
$Stat = Invoke-Command -ComputerName $server -ScriptBlock { Copy-Item -Path "C:\Temp\test\a.txt" -Destination "C:\temp\a.txt" -Recurse -Force -PassThru -Verbose } -Credential $user
Stuck both ways; please suggest another way to achieve this without mounting the target folder.
Copy-Item -Path "C:\Temp\test\a.txt" -Dest "\\$($server)\c$\temp\a.txt"
Use the built-in administrative drive shares to copy it over; you may need to provide creds for this.
You might find this helper function useful to build the remote path correctly:
Function Get-RemotePath($Server,$Path){
"\\$($Server)\$($Path -replace ':','$')"
}
Get-RemotePath -Server "SERVER01" -Path "C:\Temp\File.txt"
\\SERVER01\C$\Temp\File.txt
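If the admin share needs explicit credentials, one option (a sketch; the drive name X and the paths are placeholders, and -Credential on FileSystem drives needs PowerShell 3.0+) is to map it first with New-PSDrive:
$cred = Get-Credential
New-PSDrive -Name X -PSProvider FileSystem -Root "\\$server\c`$" -Credential $cred | Out-Null
Copy-Item -Path "C:\Temp\test\a.txt" -Destination "X:\temp\a.txt"
Remove-PSDrive -Name X  # unmap when done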
Why not use WMI to copy the file instead?
It can be asynchronous and is very efficient.
I have a post here which explains it.
Powershell - Copying File to Remote Host and Executing Install exe using WMI

Confusion on Start-Job for File Share

So, this is absolutely whipping me. I have created a script that moves data from one file share to another based on a user's responses to a number of questions. What I would like to do is have a background job running that provides a report of all the files being moved, prior to the move taking place. As a result, I added this little bit of code, which absolutely doesn't gather info from the source file share. It simply reports data from my own machine. What am I doing wrong?
While ($sourcepath -eq $null) {
$sourcepath= read-host "Enter source file path"
}
Set-Location $sourcepath
Start-Job -ScriptBlock {Get-ChildItem -Recurse | Out-File c:\users\john.smith\desktop\shareonfile.txt}
Jobs run in a different process, with their own scope. The working directory won't be inherited. To demonstrate this:
Set-Location $sourcepath
Start-Job -ScriptBlock {
Get-Location
} | Wait-Job | Receive-Job
Get-Job | Remove-Job
You should avoid setting the location anyway, and just pass the path to Get-ChildItem. To do that in a job, define a parameter and pass its value like so:
Start-job -Scriptblock { param($thePath)
Get-childitem -Path $thePath -recurse |
Out-File c:\users\john.smith\desktop\shareonfile.txt
} -ArgumentList $sourcepath
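The same pattern extends to the output file, so nothing inside the job is hardcoded (a sketch; $reportPath is just an example value):
$reportPath = 'c:\users\john.smith\desktop\shareonfile.txt'
Start-Job -ScriptBlock { param($thePath, $theReport)
    Get-ChildItem -Path $thePath -Recurse | Out-File $theReport
} -ArgumentList $sourcepath, $reportPath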

Powershell using Invoke-Command and Get-Childitem not delivering content

I am using PowerShell 4.0 on one computer (rem_comp) to access another (loc_comp, which has PowerShell 2.0 installed) in order to get the number of files, excluding folders:
$var1 = 'H:\scripts'
Invoke-Command -Computername loc_comp -scriptblock {(Get-Childitem $var1 -recurse | Where-Object {!$_.PSIsContainer}).count}
However, when using $var1 inside the -ScriptBlock, it does not deliver anything (not even an error message).
When hardcoding the path instead:
Invoke-Command -Computername loc_comp -scriptblock {(Get-Childitem 'H:\scripts' -recurse | Where-Object {!$_.PSIsContainer}).count}
it works!
Note: Changing the quotes around $var1 from ' to " does not help.
Running the command locally without Invoke-Command shows the same problem.
How to fix this?
To complement CmdrTchort's helpful answer:
PS v3 introduced the special using: scope, which allows direct use of local variables in script blocks sent to remote machines (e.g., $using:var1).
This should work for you, because the machine you're running Invoke-Command on has v4.
$var1 = 'H:\scripts'
Invoke-Command -Computername loc_comp -scriptblock `
{ (Get-Childitem $using:var1 -recurse | Where-Object {!$_.PSIsContainer}).count }
Note that using: only works when Invoke-Command actually targets a remote machine.
When you're using Invoke-Command with a script block, the script block cannot access variables from the outer scope (scoping rules).
You can however, define the params and pass them along with the -Argumentlist
Example:
Invoke-Command -ComputerName "localhost" {param($Param1=$False, $Param2=$False) Write-host "$param1 $param2" } -ArgumentList $False,$True
The following should work for your example:
Invoke-Command -Computername loc_comp -scriptblock {param($var1) (Get-Childitem $var1 -recurse | Where-Object {!$_.PSIsContainer}).count} -ArgumentList $var1

Run Registry File Remotely with PowerShell

I'm using the following script to run test.reg on multiple remote systems:
$computers = Get-Content computers.txt
Invoke-Command -ComputerName $computers -ScriptBlock {
regedit /i /s "\\SERVER\C$\RegistryFiles\test.reg"
}
The script doesn't error, but the registry entry doesn't import on any of the systems.
I know test.reg file is a valid registry file because I copied it over, ran it manually, and the registry key imports. I also made sure PowerShell Remoting is enabled on the remote computers.
Any ideas why the registry key isn't importing?
I found that the best way to avoid server authentication issues and cut down on complexity was simply to pass the .reg file content as a parameter:
$regFile = @"
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Tcpip\Parameters]
"MaxUserPort"=dword:00005000
"TcpTimedWaitDelay"=dword:0000001e
"@
Invoke-Command -ComputerName computerName -ScriptBlock {param($regFile)
    $regFile | Out-File $env:temp\a.reg
    reg.exe import $env:temp\a.reg
} -ArgumentList $regFile
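To hit several machines, the same call fans out naturally, since -ComputerName accepts an array (a sketch reusing the computers.txt list from the question):
$computers = Get-Content computers.txt
Invoke-Command -ComputerName $computers -ScriptBlock {param($regFile)
    $regFile | Out-File $env:temp\a.reg
    reg.exe import $env:temp\a.reg
} -ArgumentList $regFile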
I posted on some PowerShell forums and finally got this working.
I had to 1) move the $newfile variable inside the loop and 2) escape the $ in the path stored in the $newfile variable.
For reference, the final script looks like this if anyone wants to use it:
$servers = Get-Content servers.txt
$HostedRegFile = "C:\Scripts\RegistryFiles\test.reg"
foreach ($server in $servers) {
    $newfile = "\\$server\c`$\Downloads\RegistryFiles\test.reg"
    New-Item -ErrorAction SilentlyContinue -ItemType directory -Path \\$server\C$\Downloads\RegistryFiles
    Copy-Item $HostedRegFile -Destination $newfile
    Invoke-Command -ComputerName $server -ScriptBlock {
        Start-Process -FilePath "C:\windows\regedit.exe" -ArgumentList "/s C:\Downloads\RegistryFiles\test.reg"
    }
}

Call VBScript With Return Value on Remote Machine With Powershell

I need to call a remote VB script from Powershell, and the VB script needs to run on the remote machine.
I have been using ([wmiclass]"\\$computer\root\cimv2:Win32_Process").Create("C:\test.vbs")
This works; however, I can't get a return value from the script, just a return value from the Win32_Process call.
I would convert the whole thing to PowerShell, but I can't: I'm connecting to a legacy domain on which I can't install additional tools, so I have to call the remote VBScript.
This is an old question but I would like to share my solution. It's the same as the one posted by Ansgar but it's been tested and working fine:
$VNC = '\\share\software\AppName\_Install_Silent.vbs'
$Computer = 'RemoteHost'
$TMP = "\\$Computer\c$\TEMP"
if (!(Test-Path $TMP)) {
New-Item -Path $TMP -ItemType Directory
}
Copy-Item -LiteralPath (Split-Path $VNC -Parent) -Destination $TMP -Container -Recurse -Force -Verbose
$LocalPath = Join-Path 'C:\TEMP' (Join-Path (Split-Path $VNC -Parent | Split-Path -Leaf) (Split-Path $VNC -Leaf))
Invoke-Command -ScriptBlock {cscript.exe $Using:LocalPath} -Computer $Computer
# A restart of the remote host might be needed
The difference is that you have to copy the files to the remote machine first to avoid the double-hop issue; then you can run the install with the $Using: variable.
Hope this helps someone.
I'd probably try either remote invocation:
Invoke-Command -ScriptBlock { cscript.exe "C:\test.vbs" } -Computer $computer
or PsExec:
PsExec \\$computer cscript.exe "C:\test.vbs"
Can't test either of them right now, though.
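Since the goal is the script's return value, one way (a sketch; assumes the .vbs sets its exit code via WScript.Quit) is to capture both cscript's printed output and its exit code inside the script block and send them back as one object:
$result = Invoke-Command -ComputerName $computer -ScriptBlock {
    $output = cscript.exe //NoLogo "C:\test.vbs"  # whatever the script echoes
    New-Object PSObject -Property @{Output = $output; ExitCode = $LASTEXITCODE}
}
$result.ExitCode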