PowerShell Remoting: Speeding up a ForEach Loop against Hosted Exchange - powershell

I have a CSV of email addresses and departments that I need to set on Live@edu. The command I currently have looks something like this:
Import-CSV departments.csv | ForEach-Object { Set-User $_.EmailAddress $_.Department }
The problem is, this operation takes FOREVER.
My first thought is that it would be great to have the ForEach-Object command actually be forwarded over to the remote machine, so that it only needs to create the one pipeline between the two machines, but when I go into the PSSession, there doesn't seem to be any ForEach-Object available. For reference, this is how I import the PSSession:
Import-PSSession(New-PSSession -ConfigurationName Microsoft.Exchange `
-ConnectionUri 'https://ps.outlook.com/powershell' `
-Credential (Get-Credential) `
-Authentication Basic -AllowRedirection)
Is there a better way to import the session so that ForEach-Object runs remotely, or to import an aliased version of the remote ForEach-Object (perhaps as ForEach-Object-Remote), or does anybody have something better to suggest to streamline this process?
UPDATE:
A couple of things I've tried:
Using the -AsJob switch on the implicitly remoted command.
Import-CSV departments.csv | ForEach-Object { Set-User $_.EmailAddress $_.Department -AsJob }
This, unfortunately, doesn't work because there are throttling limits in place that don't allow the additional connections. Worse than that, I don't even know that anything went wrong until I check the results, and find that very few of them actually got changed.
Importing the ForEach-Object under a different name.
Turns out that adding a prefix is as easy as putting -Prefix RS in the Import-PSSession command to have things like the ForEach-Object from the remote session become ForEach-RSObject in the local session. Unfortunately, this won't work for me, because the server I'm connecting to does not have the Microsoft.Powershell ConfigurationName available to me.
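For illustration, the prefixed import I attempted looked roughly like this (a sketch only; it needs the remote Microsoft.Powershell endpoint, which my provider does not expose):
# Sketch: with -Prefix RS, a remote ForEach-Object would be imported locally as ForEach-RSObject.
Import-PSSession (New-PSSession -ConfigurationName Microsoft.Powershell `
    -ConnectionUri 'https://ps.outlook.com/powershell' `
    -Credential (Get-Credential) `
    -Authentication Basic -AllowRedirection) -Prefix RS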
UPDATE 2: The Set-User cmdlet seems to be Microsoft-provided for Live@edu administration. Its purpose is to set user attributes. It is not a script or cmdlet that I am able to debug. It doesn't take pipeline input, unfortunately, so that would not be able to fix the issue.
As far as I can tell, the problem is that it has to construct and tear down a pipeline to the remote machine every time this command runs, rather than being able to reuse it. The remote ForEach idea would have allowed me to offload that loop and avoid creating all those remote pipelines, while -AsJob would have allowed them all to run in parallel. However, it also caused errors to fail silently, and only a few of the records actually got properly updated.
I suspect at this point that I will not be able to speed up this command, but will have to do whatever I can to limit the amount of data that needs to be changed in a particular run by keeping better track of what I have done before (keeping differential snapshots). Makes the job a bit harder.
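The differential approach I have in mind looks roughly like this (a sketch; previous.csv is assumed to be a snapshot saved at the end of the prior run):
# Only push rows whose Department changed since the last run.
$previous = Import-Csv previous.csv
$current  = Import-Csv departments.csv
$changed  = Compare-Object $previous $current -Property EmailAddress, Department |
    Where-Object { $_.SideIndicator -eq '=>' }
$changed | ForEach-Object { Set-User $_.EmailAddress $_.Department }
Copy-Item departments.csv previous.csv   # becomes the snapshot for the next run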
EDIT: Start-Automate left a very helpful answer; unfortunately, neither of his suggestions works. It is my feeling at this point that I won't find a way to speed this up until my provider gives access to more PowerShell cmdlets, or the Exchange cmdlets are modified to allow multiple pipelines, neither of which I expect to happen any time soon. I am marking his answer as correct, despite the ultimate result that nothing helps significantly. Thanks, Start-Automate.

You can speed up your script, and also avoid trying to make two connections to the server, by using the foreach statement instead of ForEach-Object.
$departments = @(Import-Csv .\departments.csv)
foreach ($user in $departments) {
    Set-User $user.EmailAddress $user.Department
}
If you need to batch, you could use the for statement, moving forward by the batch size on each pass:
for ($i = 0; $i -lt $departments.Count; $i += 3) {
    $jobs = @()
    $jobs += Invoke-Command { Set-User $departments[$i].EmailAddress $departments[$i].Department } -AsJob
    $jobs += Invoke-Command { Set-User $departments[$i + 1].EmailAddress $departments[$i + 1].Department } -AsJob
    $jobs += Invoke-Command { Set-User $departments[$i + 2].EmailAddress $departments[$i + 2].Department } -AsJob
    $jobs | Wait-Job | Receive-Job
}
Hope this helps

Related

[System.IO.Path]::GetTempPath() outputs local temp directory when called through Invoke-Command on a remote machine

I'm running PowerShell commands on a remote machine by the use of Invoke-Command -ComputerName. I'm trying to obtain the path of the temporary directory of the remote machine.
Depending on where I call [System.IO.Path]::GetTempPath() it either outputs the expected remote directory C:\Users\…\AppData\Local\Temp or my local temporary directory C:\temp.
This command is not working as expected:
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
Write-Output ([System.IO.Path]::GetTempPath())
}
# Outputs local directory 'C:\temp'
# Expected remote directory 'C:\Users\…\AppData\Local\Temp'
The problem can be reproduced with other commands than Write-Output, e.g. Join-Path.
By contrast, the following code samples all give the expected output of C:\Users\…\AppData\Local\Temp.
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
[System.IO.Path]::GetTempPath()
}
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
$tmp = [System.IO.Path]::GetTempPath(); Write-Output $tmp
}
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
Start-Sleep 1
Write-Output ([System.IO.Path]::GetTempPath())
}
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
Write-Output ([System.IO.Path]::GetTempPath())
Start-Sleep 1
}
Obviously Start-Sleep isn't a solution, but it seems to indicate some kind of timing problem.
Suspecting that the problem isn't limited to GetTempPath() I tried another user-related .NET API, which also unexpectedly outputs my local folder instead of the remote one:
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
Write-Output ([System.Environment]::GetFolderPath([Environment+SpecialFolder]::MyDocuments))
}
How can I use [System.IO.Path]::GetTempPath() and other .NET API in a PowerShell remote session in a predictable way?
Santiago Squarzon has found the relevant bug report:
GitHub issue #14511
The issue equally affects Enter-PSSession.
While a decision was made to fix the problem, that fix hasn't yet been made as of PowerShell 7.3.1 - and given that the legacy PowerShell edition, Windows PowerShell (versions up to v5.1, the latest and final version) will see security-critical fixes only, the fix will likely never be implemented there.
While the linked bug report says the behavior was originally by (questionable) design, the fact that it only surfaces in very narrow circumstances (see below) implies that, at the very least, the implementation of that original design intent was faulty.
The problem seems to be specific to a script block with the following characteristics:
containing a single statement
that is a cmdlet call (possibly with additional pipeline segments)
whose arguments involve .NET method calls, which are then unexpectedly performed on the caller's side.
Workaround:
Make sure that your remotely executing script block contains more than one statement.
A simple way to add a no-op dummy statement is to use $null++:
# This makes [System.IO.Path]::GetTempPath() *locally* report
# 'C:\temp\'
# *Remotely*, the *original* value should be in effect, even when targeting the
# same machine (given that the env. var. modification is process-local).
$env:TMP = 'C:\temp'
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
Write-Output ([System.IO.Path]::GetTempPath()); $null++ # <- dummy statement.
}
Other workarounds are possible too, such as enclosing the cmdlet call in (...) or inserting a dummy variable assignment
(Write-Output ($unused = [System.IO.Path]::GetTempPath()))
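For instance, applying the (...) variant to the original failing call (a sketch):
Invoke-Command -ComputerName MyRemoteMachine -ScriptBlock {
    # Grouping parentheses around the single cmdlet call are enough to avoid the bug.
    (Write-Output ([System.IO.Path]::GetTempPath()))
}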
Your Start-Sleep workaround happened to work because by definition it too added another statement; but what that statement is doesn't matter, and there's no timing component to the bug.

Passing active Powershell PSSession connection as argument in Start-Job

I am writing a script which gathers data from Exchange Online concerning mailbox permissions for each mailbox in our organization. To do this I gather all mailbox data into a variable then I use foreach to iterate through each mailbox and check the mailbox permissions applied to it. This takes time when you are working with over 15000 mailboxes.
I would like to use Powershell Jobs to speed this process up by having multiple jobs checking permissions and appending them to a single CSV file. Is there a way to pass an active PSSession into a new job so that the job "shares" the active session of the parent process that spawned the job and does not require a new one to be established?
I could place a New-PSSession call into the function but Microsoft has active session limits in Exchange Online PSSessions so it would limit the number of jobs I could have running at one time to 3. The rest would have to be queued through a while loop. If I can share a single session between multiple jobs I would be limited by computer resources rather than connection restrictions.
Has anyone successfully passed an active PSSession through to a job before?
Edit:
I've been working on using runspaces to try to accomplish this with Boe Prox's PoshRSJobs module. I'm still having some difficulty getting it to work properly. It doesn't create the CSV or append to it, but only when I try to sort out the permissions within the foreach statement. The Write-Output inside the script block only outputs the implicit remoting information too, which is odd.
Code is below.
Connect-ToOffice365TenantPSSession
$mailboxes = Get-Mailbox -ResultSize 10 -IncludeInactiveMailbox
$indexCount = 1
foreach ($mailbox in $mailboxes) {
$script = #"
`$cred = Import-Clixml -Path 'C:\Users\Foo\.credentials\StoredLocalCreds.xml'
`$o365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential `$cred -Authentication Basic -AllowRedirection
Import-PSSession `$o365Session -CommandName #('Get-Mailbox','Get-MailboxPermission')
`$internal_mailbox = `$Using:mailbox
`$mailboxPermissions = `$internal_mailbox | Get-MailboxPermission
foreach (`$permission in (`$mailboxPermissions | Where-Object {`$_.User -match 'tenantName|companyDomain'}))
{
`$userPermissions = `$permission | Select-Object Identity, User, AccessRights
`$permissionObject = [PSCustomObject]@{
"MailboxName" = `$userPermissions.Identity
"MailboxAddress" = `$internal_mailbox.PrimarySmtpAddress
"MailboxType" = `$internal_mailbox.RecipientTypeDetails
"UserWithAccess" = `$userPermissions.User
"AccessRights" = `$userPermissions.AccessRights
}
if (Test-Path 'C:\Scripts\MailboxPermissions.csv') {
`$permissionObject | Export-Csv 'C:\Scripts\MailboxPermissions.csv' -NoTypeInformation -Append
} else {
New-Item -Path 'C:\Scripts\MailboxPermissions.csv'
`$permissionObject | Export-Csv 'C:\Scripts\MailboxPermissions.csv' -NoTypeInformation -Append
}
Write-Output `$permissionObject
}
"#
$scriptBlock = [scriptblock]::Create($script)
$continue = $false
do
{
if ((Get-RSJob | Where-Object {$_.State -eq "Running"}).count -lt 3) {
Start-RSJob -Name "Mailbox $indexCount" -ScriptBlock $scriptBlock
$indexCount++
$continue = $true
}
else {
Start-Sleep 1
}
} while ($continue -eq $false)
}
Get-RSJob | Receive-RSJob
Thanks for the suggestions.
You have specifications here, but you are not showing code, and yet you are asking for an opinion.
That is, as many would say here, off topic, because the goal here is to assist with code that is not working or the like. So, at this point, have you tried what you are asking, and if so, what happened?
So, IMHO … Yet, since you are here, let's not send you away empty-handed.
As per a similar ask here, the accepted answer delivered was...
… you have a current PSSession opened up on your console and then you
are attempting to use that same session in a background job. This will
not work because you are using two different PowerShell processes and
are unable to share the live data between the runspaces as it is
deserialized and loses all of its capabilities. You will need to look
at creating your own PowerShell runspaces if your goal is to share
live variables across multiple PowerShell sessions.
Even with the above, you still have those consumption limits you mention and technically based on your use case, you'd more than likely end up with serious performance issues as well.
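To illustrate the runspace route the quoted answer points at, here is a minimal sketch (assumptions: $o365Session is the live session created in your parent scope; note that a single remote session still processes one pipeline at a time, so the Exchange Online connection limits you mention still cap real parallelism):
# Hand the live session variable to in-process runspaces via InitialSessionState.
$iss = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
$entry = New-Object System.Management.Automation.Runspaces.SessionStateVariableEntry -ArgumentList 'o365Session', $o365Session, 'Live EXO session'
$iss.Variables.Add($entry)

$pool = [System.Management.Automation.Runspaces.RunspaceFactory]::CreateRunspacePool(1, 3, $iss, $Host)
$pool.Open()

$ps = [PowerShell]::Create()
$ps.RunspacePool = $pool
[void]$ps.AddScript({
    # $o365Session is visible here because it was seeded into the runspace pool above.
    Invoke-Command -Session $o365Session -ScriptBlock { Get-Mailbox -ResultSize 1 }
})
$result = $ps.Invoke()   # use BeginInvoke()/EndInvoke() to overlap multiple pipelines
$pool.Close()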

Need a Way to Test a user against remote server using Powershell

My only objective is to validate the user account against bunch servers. I am using below commands to do it.
$creds2= Get-Credential
$servers = Get-Content ('C:\Users\vishnuvardhan.chapal\Documents\Widnows Servers success in 139 and 445.txt')
$servers | ForEach-Object {Get-WmiObject Win32_ComputerSystem -ComputerName $_ -Credential $creds2} | Out-GridView
Here, I am encountering two problems.
1) In the grid view, I am only getting the hostname without the FQDN, as shown in the screenshot below.
2) The above output is only for the servers that succeeded; for the failed ones (the servers where authentication is failing) I am getting output in the PowerShell window like the screen below.
Now, my goal is to combine both outputs in a single place. Is it possible? If yes, how do I do it? Please shed some light on it.
Apart from the above, is there any way to test this more easily, i.e. a direct command to test user authentication against a remote server?
FYI, my only goal for this exercise is to validate user authentication, not to get details from a remote computer.
Out-GridView is not a good way to handle these things. I would recommend converting the output to JSON or some other structured format and then writing it to files, or parsing it however you wish.
There are multiple ways to check that but error handling will solve your issue:
try
{
    $creds2 = Get-Credential
    $servers = Get-Content ('C:\Users\vishnuvardhan.chapal\Documents\Widnows Servers success in 139 and 445.txt')
    $servers
    foreach ($server in $servers)
    {
        try
        {
            Get-WmiObject Win32_ComputerSystem -ComputerName $Server -Credential $creds2
        }
        catch
        {
            "Error in accessing the server - $Server with the given credential. Kindly validate."
        }
    }
}
catch
{
    $_.Exception.Message
}
Within the loop I have also added a try/catch, so that if one server fails the script proceeds with the next server from the list, and the catch captures the error with the server name along with the message.
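If you want the successes and failures combined in one place, you can emit an object from both branches and collect them (a sketch; the server-list path is whatever yours is, and the FQDN is rebuilt from the DNSHostName and Domain properties of Win32_ComputerSystem):
$creds2  = Get-Credential
$servers = Get-Content 'C:\path\to\servers.txt'   # your list file

$results = foreach ($server in $servers)
{
    try
    {
        # -ErrorAction Stop turns failures into catchable (terminating) errors.
        $cs = Get-WmiObject Win32_ComputerSystem -ComputerName $server -Credential $creds2 -ErrorAction Stop
        [PSCustomObject]@{ Server = $server; FQDN = "$($cs.DNSHostName).$($cs.Domain)"; Status = 'Authenticated'; Error = $null }
    }
    catch
    {
        [PSCustomObject]@{ Server = $server; FQDN = $null; Status = 'Failed'; Error = $_.Exception.Message }
    }
}

$results | Out-GridView   # or: $results | Export-Csv results.csv -NoTypeInformation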
Hope it helps.

Powershell delete MSMQ remotely

I was wondering if it was possible to delete queues remotely via PowerShell? I have the following script:
cls
[Reflection.Assembly]::LoadWithPartialName("System.Messaging")
$computers = @("comp1","comp2","comp3");
foreach($computer in $computers) {
$messageQueues = [System.Messaging.MessageQueue]::GetPrivateQueuesByMachine($computer);
foreach ($queue in $messageQueues) {
$endpoint = [string]::Format("FormatName:DIRECT=OS:{0}\{1}", $computer, $queue.QueueName);
Write-Host $endpoint
[System.Messaging.MessageQueue]::Delete($endpoint);
}
}
This works fine if I run it on the machine whose queues I want to delete; however, when I run it remotely I get the error:
The specified format name does not support the requested operation. For example, a direct queue format name cannot be deleted.
Any ideas if this can be done?
EDIT
Oddly, I have figured out that I can remote onto the machine via PowerShell and execute a script block. However, I don't understand the difference between doing:
THIS:
$endpoint = [string]::Format("FormatName:DIRECT=OS:{0}\{1}", $computer, $queue.QueueName);
Invoke-Command -ComputerName $computer -ScriptBlock { [Reflection.Assembly]::LoadWithPartialName("System.Messaging"); [System.Messaging.MessageQueue]::Delete($endpoint) };
AND THIS:
Invoke-Command -ComputerName $computer -ScriptBlock { [Reflection.Assembly]::LoadWithPartialName("System.Messaging"); [System.Messaging.MessageQueue]::Delete("FormatName:DIRECT=OS:MY_SERVER\some.endpoint") };
The value of $endpoint is the same; however, for some odd reason it doesn't like the variable approach, even though both values are identical. I tested this by setting $endpoint and then calling Delete. I get the error:
Exception calling "Delete" with "1" argument(s): "Invalid value for parameter path."
What I'm trying to say is: if I hard-code the value as part of the argument it works, but if I assign it to a variable and then invoke the method I get an error.
For historical purposes, if anyone else is experiencing this issue or is wondering how to delete queues remotely, please see below.
How do I delete private queues on a remote computer? It is possible to delete queues remotely. This can be achieved using the command Enable-PSRemoting -Force. Without this, you encounter the issue @JohnBreakWell indicated (see his link to MSDN).
The scope of variables when using Invoke-Command? The problem I found was the variables I declared were simply out of scope (script block was unable to see it). To rectify this, I simply did the following:
The important bits are the -ArgumentList parameter and the use of param inside the script block.
$computers = @("comp1","comp2");
foreach ($computer in $computers) {
    [Reflection.Assembly]::LoadWithPartialName("System.Messaging");
    $messageQueues = [System.Messaging.MessageQueue]::GetPrivateQueuesByMachine($computer);
    foreach ($queue in $messageQueues) {
        $endpoint = [string]::Format("FormatName:DIRECT=OS:{0}\{1}", $computer, $queue.QueueName);
        Enable-PSRemoting -Force
        Invoke-Command -ComputerName $computer -ScriptBlock {
            param ($computer, $endpoint)
            [Reflection.Assembly]::LoadWithPartialName("System.Messaging");
            [System.Messaging.MessageQueue]::Delete($endpoint)
        } -ArgumentList $computer, $endpoint
    }
}
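On PowerShell 3.0 or later, the $Using: scope modifier is an alternative to -ArgumentList/param (a sketch I haven't gone back and re-tested here):
Invoke-Command -ComputerName $computer -ScriptBlock {
    [Reflection.Assembly]::LoadWithPartialName("System.Messaging");
    # $using: pulls the local variable's value into the remote script block.
    [System.Messaging.MessageQueue]::Delete($using:endpoint)
}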
You cannot delete a remote private queue.
You need to perform the operation locally to the queue.
From MQDeleteQueue:
Remarks
(2nd paragraph)
"Private queues registered on a remote computer ... cannot be deleted."
As Dr. Schizo mentioned, you'll need to execute
Enable-PSRemoting -Force
on the remote machine, but then, assuming you're using Server 2012 r2, it's as simple as:
Invoke-Command -ComputerName COMPUTERNAME { Get-MsmqQueue -Name QUEUENAME | Remove-MsmqQueue }

Powershell to shut down a VM

I have a small Powershell script that is used to shut down my virtual machines in event of an extended power outage. It takes a specific VM object and forces a shutdown.
Function DirtyShutdown
{ param([VMware.VimAutomation.ViCore.Impl.V1.Inventory.VirtualMachineImpl]$VM )
$VM | Stop-VM -Confirm:$false
}
I would like to speed up this process using the start-job command to run all these tasks in parallel. I have tried using several variants including the following which I believe to be correct.
Start-Job -InputObject $VM -ScriptBlock { $input | Shutdown-VMGuest -Confirm:$false }
Based on the Receive-Job output, it appears the problem is that the snap-in in use (added before the above function is called) is not loaded in the context of Start-Job.
What is the correct syntax to make this happen?
While I appreciate the desire to use PowerShell v2's job subsystem for this task, note that vCenter has a built-in job system which you can take advantage of here. Most PowerCLI cmdlets which perform a change to your environment have a RunAsync parameter. To know which ones, run this piece of PowerShell code:
get-help * -parameter runasync
The RunAsync parameter will take your command(s) and queue them up in vCenter. The cmdlet will return a task object and then immediately return control back to your script.
To turn this into an answer in your case, simply append "-runasync" to the end of your Stop-VM command, like so:
$VM | Stop-VM -Confirm:$false -RunAsync
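If you later need to block until the queued shutdown completes, the task object that -RunAsync returns can be handed to PowerCLI's Wait-Task (sketch):
$task = $VM | Stop-VM -Confirm:$false -RunAsync   # returns immediately with a task object
$task | Wait-Task                                 # optional: wait for vCenter to finish it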
Each time you start a job, PowerShell creates a new runspace. This means a new environment that you may need to initialize, and that includes loading snap-ins and connecting to your VI Server. Start-Job has a parameter that you can use here called InitializationScript. Try something like this:
Start-Job -InitializationScript { Add-PSSnapin VMware.VimAutomation.Core } {
    Connect-ViServer myserver
    Get-VM foo | Stop-VM
}