I am writing a script that gathers data from Exchange Online about the mailbox permissions applied to each mailbox in our organization. To do this I gather all mailbox data into a variable, then use foreach to iterate through each mailbox and check the mailbox permissions applied to it. This takes a long time when you are working with over 15,000 mailboxes.
I would like to use PowerShell jobs to speed this process up by having multiple jobs checking permissions and appending them to a single CSV file. Is there a way to pass an active PSSession into a new job so that the job "shares" the active session of the parent process that spawned it and does not require a new one to be established?
I could place a New-PSSession call into the function, but Microsoft imposes active session limits on Exchange Online PSSessions, so that would limit the number of jobs I could have running at one time to 3; the rest would have to be queued through a while loop. If I could share a single session between multiple jobs, I would be limited by computer resources rather than connection restrictions.
Has anyone successfully passed an active PSSession through to a job before?
Edit:
I've been working on using runspaces to accomplish this with Boe Prox's PoshRSJob module. I'm still having some difficulty getting it to work properly: it doesn't create or append to the CSV, but only when I try to sort out the permissions within the foreach statement. The Write-Output inside the script block also only outputs the implicit remoting information, which is odd.
Code is below.
Connect-ToOffice365TenantPSSession

$mailboxes = Get-Mailbox -ResultSize 10 -IncludeInactiveMailbox
$indexCount = 1

foreach ($mailbox in $mailboxes) {
    $script = @"
`$cred = Import-Clixml -Path 'C:\Users\Foo\.credentials\StoredLocalCreds.xml'
`$o365Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri "https://outlook.office365.com/powershell-liveid/" -Credential `$cred -Authentication Basic -AllowRedirection
Import-PSSession `$o365Session -CommandName @('Get-Mailbox','Get-MailboxPermission')

`$internal_mailbox = `$Using:mailbox
`$mailboxPermissions = `$internal_mailbox | Get-MailboxPermission

foreach (`$permission in (`$mailboxPermissions | Where-Object {`$_.User -match 'tenantName|companyDomain'}))
{
    `$userPermissions = `$permission | Select-Object Identity, User, AccessRights
    `$permissionObject = [PSCustomObject]@{
        "MailboxName"    = `$userPermissions.Identity
        "MailboxAddress" = `$internal_mailbox.PrimarySmtpAddress
        "MailboxType"    = `$internal_mailbox.RecipientTypeDetails
        "UserWithAccess" = `$userPermissions.User
        "AccessRights"   = `$userPermissions.AccessRights
    }
    if (Test-Path 'C:\Scripts\MailboxPermissions.csv') {
        `$permissionObject | Export-Csv 'C:\Scripts\MailboxPermissions.csv' -NoTypeInformation -Append
    } else {
        New-Item -Path 'C:\Scripts\MailboxPermissions.csv'
        `$permissionObject | Export-Csv 'C:\Scripts\MailboxPermissions.csv' -NoTypeInformation -Append
    }
    Write-Output `$permissionObject
}
"@
    $scriptBlock = [scriptblock]::Create($script)
    $continue = $false
    do
    {
        if ((Get-RSJob | Where-Object {$_.State -eq "Running"}).Count -lt 3) {
            Start-RSJob -Name "Mailbox $indexCount" -ScriptBlock $scriptBlock
            $indexCount++
            $continue = $true
        }
        else {
            Start-Sleep 1
        }
    } while ($continue -eq $false)
}

Get-RSJob | Receive-RSJob
Thanks for the suggestions.
You have specifications here, but you are not showing code and are asking for an opinion. As many would say here, that is off topic, because the goal is to assist with code that is not working, or the like. So, at this point, have you tried what you are asking, and if so, what happened?
So, IMHO … yet, since you are here, let's not send you away empty-handed.
As per a similar ask here, the accepted answer delivered was...
… you have a current PSSession opened up on your console and then you are attempting to use that same session in a background job. This will not work because you are using two different PowerShell processes and are unable to share the live data between the runspaces as it is deserialized and loses all of its capabilities. You will need to look at creating your own PowerShell runspaces if your goal is to share live variables across multiple PowerShell sessions.
Even with the above, you still have the connection limits you mention, and based on your use case you would more than likely run into serious performance issues as well.
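For reference, a minimal sketch of the runspace approach the quoted answer points at: an in-process runspace pool seeded with a shared variable, so concurrent work can write to live objects instead of deserialized copies. The variable name Shared, the payload, and the cap of 3 runspaces are illustrative assumptions, not a drop-in solution for the Exchange scenario.
$iss = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
$shared = [hashtable]::Synchronized(@{ Results = [System.Collections.ArrayList]::new() })
$iss.Variables.Add(
    [System.Management.Automation.Runspaces.SessionStateVariableEntry]::new('Shared', $shared, 'Shared live state')
)

# Cap concurrent runspaces at 3 to mirror the connection limit mentioned in the question.
$pool = [runspacefactory]::CreateRunspacePool(1, 3, $iss, $Host)
$pool.Open()

$ps = [powershell]::Create()
$ps.RunspacePool = $pool
[void]$ps.AddScript({ [void]$Shared.Results.Add("hello from runspace $([runspace]::DefaultRunspace.Id)") })
$handle = $ps.BeginInvoke()
$ps.EndInvoke($handle) | Out-Null

$shared.Results   # the parent process can read what the runspace wrote, no serialization involved
$pool.Close()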
Related
I have an HTTP-triggered Azure Function App on the PowerShell Core stack. The script parses the body of the request and, assuming everything is OK, connects to Exchange Online and then executes two cmdlets to create a MailContact type of contact. At the end it disconnects from Exchange Online. I have a console app that executes POST requests, passing JSON data for one contact in the body. The requests are executed in a for-each loop, and after the fifth successful request I get a "runspace exceeded budget" error.
Some code snippets from the script:
...
try {
    Connect-ExchangeOnline -CertificateThumbprint $thumb -AppId $appId -Organization $org -ShowBanner:$false -CommandName Get-Contact,Get-MailContact,New-MailContact,Set-Contact,Set-MailContact,Remove-MailContact
    New-MailContact -ErrorAction Stop @p | Out-Null
    Set-Contact -ErrorAction Stop @parameters | Out-Null
}
catch {
    ...
}
finally {
    Disconnect-ExchangeOnline -Confirm:$false -InformationAction Ignore -ErrorAction SilentlyContinue
    Get-PSSession | Remove-PSSession
}
What I tried (without success):
relaxation for Exchange Online throttling policy (https://www.michev.info/Blog/Post/3205/self-service-powershell-throttling-policy-relaxation-for-exchange-online)
setting different environmental variables (like PSWorkerInProcConcurrencyUpperBound and FUNCTIONS_WORKER_PROCESS_COUNT)
What worked: having an additional Function App and then cycling every 5 requests between the two.
Additional information that might help:
PSWorkerInProcConcurrencyUpperBound = 1000
FUNCTIONS_WORKER_PROCESS_COUNT = 10
Function runtime version = ~4
PowerShell Core Version = 7
Platform = 64Bit
Plan type = Consumption (Serverless)
In addition, it takes around 7-8 seconds from sending the request until I get the response back. Connecting to Exchange Online takes a lot of the time.
Any help or hint on how to solve the runspace budget error?
A dirty workaround would be this:
try {
    Connect-ExchangeOnline @ConnectExchange
} catch {
    Write-Verbose -Verbose ($_.Exception.Message)
    $Wait  = ($_.Exception.Message) | Select-String ('(?<=for )(.*)(?= seconds)') -AllMatches
    $Count = [int]$Wait.Matches.Value
    Start-Sleep -Seconds $Count
    Connect-ExchangeOnline @ConnectExchange
}
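A slightly more general variant of the same idea (illustrative only, and it assumes the same $ConnectExchange splat): retry a few times instead of once, and fall back to a fixed delay when the error message contains no wait time.
$maxAttempts = 3
for ($attempt = 1; $attempt -le $maxAttempts; $attempt++) {
    try {
        Connect-ExchangeOnline @ConnectExchange
        break
    } catch {
        if ($attempt -eq $maxAttempts) { throw }
        # Pull the suggested wait time out of the throttling message, if it is there.
        $wait = ($_.Exception.Message | Select-String '(?<=for )(\d+)(?= seconds)').Matches.Value
        Start-Sleep -Seconds $(if ($wait) { [int]$wait } else { 60 })
    }
}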
I have the following part of my script:
$active_processes = (Get-WmiObject -Class Win32_Process | Where-Object Path -like $path | Select-Object -ExpandProperty Path | Split-Path -Leaf | Select-Object -Unique)
It's working fine, but I also need to check whether the process I end up with is running with elevated rights, so that I can launch another process with elevated rights if necessary and it can interact with said process. I don't see any information about elevated rights in the Get-WmiObject output, and I was wondering if I'm missing it or if there's another way to get that information.
I don't need to run the PowerShell script as administrator. What I need is to find out whether an executable requires elevated rights when launched, and I need to find this information via PowerShell.
After some research on how Windows knows whether it needs admin rights to run an executable, I concluded that there are a couple of ways, but the most recommended and reliable is reading the executable's manifest, so I wrote the following function:
function Get-ManifestFromExe {
    Param(
        [Parameter(Mandatory=$true,Position=0,ValueFromPipelineByPropertyName=$true)]
        [Alias("Path")]
        [ValidateScript({Test-Path $_ -IsValid})]
        [String]$FullName
    )
    begin{
        $stringStart = '<assembly'
        $stringEnd   = 'assembly>'
    }
    process{
        $content = Get-Content $FullName -Raw
        $indexStart = $content.IndexOf($stringStart)
        if ($indexStart -lt 0) {
            # No embedded manifest found in this executable
            return
        }
        $content = $content.Substring($indexStart)
        $indexEnd = ($content.IndexOf($stringEnd)) + $stringEnd.Length
        $content = $content.Substring(0, $indexEnd)
        if ($content -match "$stringStart(.|\s)+(?=$stringEnd)$stringEnd") {
            return [XML]$Matches[0]
        }
    }
}
function Test-IsAdminRequired {
    Param(
        [Parameter(Mandatory=$true,Position=0)]
        [XML]$xml
    )
    $value = $xml.assembly.trustInfo.security.requestedPrivileges.requestedExecutionLevel.level
    if (-not [String]::IsNullOrEmpty($value)) {
        return ($value -eq "requireAdministrator" -or $value -eq "highestAvailable")
    } else {
        Write-Error "Provided xml does not contain requestedExecutionLevel node or level property"
    }
}
$exe = '.\Firefox Installer.exe'
$exeManifest = Get-ManifestFromExe -Path $exe
Test-IsAdminRequired -xml $exeManifest
It works by extracting the manifest XML from the executable and checking the requestedExecutionLevel node's level property. The values accepted for this property are on this page, and quoted here:
asInvoker, requesting no additional permissions. This level requires no additional trust prompts.
highestAvailable, requesting the highest permissions available to the parent process.
requireAdministrator, requesting full administrator permissions.
So from this we can conclude that only highestAvailable and requireAdministrator need admin privileges, so I check for those. With that we would be done, EXCEPT that some executables I tested (mostly installers) don't require admin to run but instead prompt for UAC when they run their child executable; I don't really see a way to check for this, sorry.
BTW, I really enjoyed this question (especially the research), hope this can help you.
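As a usage sketch (the folder path is made up), the two functions above can be combined to scan a directory of executables; when no embedded manifest is found, Get-ManifestFromExe returns nothing and the result is left as $null.
Get-ChildItem 'C:\Tools' -Filter *.exe | ForEach-Object {
    $manifest = Get-ManifestFromExe -FullName $_.FullName
    [PSCustomObject]@{
        Exe           = $_.Name
        RequiresAdmin = $(if ($manifest) { Test-IsAdminRequired -xml $manifest } else { $null })
    }
}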
SOURCES
What is the difference between "asInvoker" and "highestAvailable" execution levels?
reading an application's manifest file?
https://learn.microsoft.com/en-us/visualstudio/deployment/trustinfo-element-clickonce-application?view=vs-2019#requestedexecutionlevel
It's in the System.Security.Principal classes. This returns $true if the current user is elevated to local Administrator:
(New-Object System.Security.Principal.WindowsPrincipal([System.Security.Principal.WindowsIdentity]::GetCurrent())).IsInRole([System.Security.Principal.WindowsBuiltInRole]::Administrator)
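If it helps, here is a small illustrative wrapper around that same check, plus one common way to relaunch something elevated when the current process is not (the notepad.exe target and the -Verb RunAs pattern are examples, not something the answer above prescribes).
function Test-IsElevated {
    $identity  = [System.Security.Principal.WindowsIdentity]::GetCurrent()
    $principal = New-Object System.Security.Principal.WindowsPrincipal($identity)
    return $principal.IsInRole([System.Security.Principal.WindowsBuiltInRole]::Administrator)
}

if (-not (Test-IsElevated)) {
    # Starting a process with -Verb RunAs triggers a UAC elevation prompt.
    Start-Process -FilePath 'notepad.exe' -Verb RunAs
}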
I was wondering if it was possible to delete queues remotely via PowerShell? I have the following script:
cls
[Reflection.Assembly]::LoadWithPartialName("System.Messaging")
$computers = @("comp1","comp2","comp3");
foreach ($computer in $computers) {
    $messageQueues = [System.Messaging.MessageQueue]::GetPrivateQueuesByMachine($computer);
    foreach ($queue in $messageQueues) {
        $endpoint = [string]::Format("FormatName:DIRECT=OS:{0}\{1}", $computer, $queue.QueueName);
        Write-Host $endpoint
        [System.Messaging.MessageQueue]::Delete($endpoint);
    }
}
This works fine if I run it on the machine whose queues I want to delete; however, when I run it remotely I get the error:
The specified format name does not support the requested operation. For example, a direct queue format name cannot be deleted.
Any ideas if this can be done?
EDIT
Oddly, I have figured out that I can remote onto the machine via PowerShell and execute a script block. However, I don't understand the difference between doing
THIS:
$endpoint = [string]::Format("FormatName:DIRECT=OS:{0}\{1}", $computer, $queue.QueueName);
Invoke-Command -ComputerName $computer -ScriptBlock { [Reflection.Assembly]::LoadWithPartialName("System.Messaging"); [System.Messaging.MessageQueue]::Delete($endpoint) };
AND THIS:
Invoke-Command -ComputerName $computer -ScriptBlock { [Reflection.Assembly]::LoadWithPartialName("System.Messaging"); [System.Messaging.MessageQueue]::Delete("FormatName:DIRECT=OS:MY_SERVER\some.endpoint") };
The value of $endpoint is the same; however, for some odd reason it doesn't like the variable approach, even though both values are identical. I tested this by setting $endpoint and then calling Delete. I get the error:
Exception calling "Delete" with "1" argument(s): "Invalid value for parameter path."
What I'm trying to say is that if I hard-code the value as part of the argument it works, but if I assign it to a variable and then invoke the method I get an error.
For historical purposes, if anyone else is experiencing this issue or is wondering how to delete queues remotely, please see below.
How do I delete private queues on a remote computer? It is possible to delete queues remotely. This can be achieved using the command Enable-PSRemoting -Force. Without this, you encounter the issue @JohnBreakWell indicated (see his link to MSDN).
The scope of variables when using Invoke-Command? The problem I found was that the variables I declared were simply out of scope (the script block was unable to see them). To rectify this, I simply did the following.
The important bits are the -ArgumentList parameter and the use of param.
$computers = @("comp1","comp2");
foreach ($computer in $computers) {
    [Reflection.Assembly]::LoadWithPartialName("System.Messaging");
    $messageQueues = [System.Messaging.MessageQueue]::GetPrivateQueuesByMachine($computer);
    foreach ($queue in $messageQueues) {
        $endpoint = [string]::Format("FormatName:DIRECT=OS:{0}\{1}", $computer, $queue.QueueName);
        Enable-PSRemoting -Force
        Invoke-Command -ComputerName $computer -ScriptBlock {
            param ($computer, $endpoint)
            [Reflection.Assembly]::LoadWithPartialName("System.Messaging");
            [System.Messaging.MessageQueue]::Delete($endpoint)
        } -ArgumentList $computer, $endpoint
    }
}
You cannot delete a remote private queue.
You need to perform the operation locally to the queue.
From MQDeleteQueue:
Remarks
(2nd paragraph)
"Private queues registered on a remote computer ... cannot be deleted."
As Dr. Schizo mentioned, you'll need to execute
Enable-PSRemoting -Force
on the remote machine, but then, assuming you're using Server 2012 r2, it's as simple as:
Invoke-Command -ComputerName COMPUTERNAME { Get-MsmqQueue -Name QUEUENAME | Remove-MsmqQueue }
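Purely as an illustration of the same idea across several machines (the computer names and the -Credential usage here are assumptions, not part of the answer above):
$cred = Get-Credential
foreach ($computer in @('comp1','comp2','comp3')) {
    Invoke-Command -ComputerName $computer -Credential $cred -ScriptBlock {
        # Removes every private queue on the target machine; filter by name first if needed.
        Get-MsmqQueue -QueueType Private | Remove-MsmqQueue
    }
}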
I'm trying to use PowerShell to quickly find the Scheduled Tasks in the root folder of a remote server. I find all sorts of scripts that others have written, but they're either looking at the localhost or on a server in the same domain. I support servers in dozens of domains, so I need some way to pass along credentials.
Here's the meat of my script:
$server = "<computername>"
$schedule = New-Object -ComObject "Schedule.Service"
$schedule.Connect($server)
$folder = $schedule.GetFolder("\")
$tasks = $folder.GetTasks(0)
foreach ($task in $tasks) {
    if (($task = $Folder.GetTasks(0))) {
        $Tasks | ForEach-Object { [array]$results += $_ }
        $Tasks | ForEach-Object {
            New-Object -TypeName PSCustomObject -Property @{
                'Name' = $_.name
                <etc.>
                <etc.>
            }
        }
    }
}
That code works fine either on my localhost or on a server in the same domain as my workstation. In other scripts, I use Get-Credential to create $creds and (in various ways) pass that to the appropriate cmdlet. But with this one, I'm not sure. New-Object doesn't accept a -Credential parameter. I've tried wrapping various parts inside an Invoke-Command script block, since that accepts -Credential, but it fails in various ways. I'm not sure what needs to be wrapped in Invoke-Command: just the New-Object? The foreach loop? The entire thing?
Thanks in advance.
When doing the Connect call, you can pass the server, username, domain, and password:
$Schedule.Connect($serverName, $user, $domain, $password);
This should allow you to use that object on the new domain.
MSDN Reference
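A minimal sketch tying that together with Get-Credential (untested here, and it assumes the credential was entered as DOMAIN\user so GetNetworkCredential() can split out the domain):
$server  = "<computername>"
$cred    = Get-Credential
$network = $cred.GetNetworkCredential()

$schedule = New-Object -ComObject "Schedule.Service"
$schedule.Connect($server, $network.UserName, $network.Domain, $network.Password)

# Root folder tasks only, projected into simple objects.
$folder = $schedule.GetFolder("\")
$folder.GetTasks(0) | ForEach-Object {
    [PSCustomObject]@{
        Name  = $_.Name
        State = $_.State
    }
}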
I have a CSV of email addresses and Departments that I need to set on Live@edu. The command I currently have looks something like this:
Import-CSV departments.csv | ForEach-Object { Set-User $_.EmailAddress -Department $_.Department }
The problem is, this operation takes FOREVER.
My first thought is that it would be great to have the ForEach-Object command actually be forwarded over to the remote machine, so that it would only need to create one pipeline between the two machines, but when I go into the PSSession, there doesn't seem to be any ForEach-Object available. For reference, how I import the PSSession is:
Import-PSSession(New-PSSession -ConfigurationName Microsoft.Exchange `
-ConnectionUri 'https://ps.outlook.com/powershell' `
-Credential (Get-Credential) `
-Authentication Basic -AllowRedirection)
Is there a better way that I can import the session to allow ForEach-Object to be remote, or to import an aliased version of the remote ForEach-Object (perhaps as ForEach-Object-Remote), or does anybody have something better to suggest to streamline this process?
UPDATE:
A couple of things I've tried:
Using the -AsJob switch on the implicitly remoted command:
Import-CSV departments.csv | ForEach-Object { Set-User $_.EmailAddress -Department $_.Department -AsJob }
This, unfortunately, doesn't work because there are throttling limits in place that don't allow the additional connections. Worse than that, I don't even know that anything went wrong until I check the results and find that very few of them actually got changed.
Importing ForEach-Object under a different name:
It turns out that adding a prefix is as easy as putting -Prefix RS in the Import-PSSession command to have things like the remote session's ForEach-Object become ForEach-RSObject in the local session. Unfortunately, this won't work for me, because the server I'm connecting to does not have the Microsoft.PowerShell ConfigurationName available to me.
UPDATE 2: The Set-User cmdlet seems to be Microsoft-provided for Live@edu administration. Its purpose is to set user attributes. It is not a script or cmdlet that I am able to debug. It doesn't take pipeline input, unfortunately, so that would not be able to fix the issue.
As far as I can tell, the problem is that it has to construct and tear down a pipeline to the remote machine every time this command runs, rather than being able to reuse it. The remote ForEach idea would have allowed me to offload that loop and avoid having to create all those remote pipelines, while -AsJob would have allowed them to all run in parallel. However, it also caused errors to fail silently, and only a few of the records actually got properly updated.
I suspect at this point that I will not be able to speed up this command, but will have to do whatever I can to limit the amount of data that needs to be changed in a particular run by keeping better track of what I have done before (keeping differential snapshots). That makes the job a bit harder.
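A rough sketch of that differential-snapshot idea (the file names are examples): keep a copy of the last CSV that was pushed, and only call Set-User for rows whose Department actually changed.
$previous = @{}
if (Test-Path .\departments.last.csv) {
    Import-Csv .\departments.last.csv | ForEach-Object { $previous[$_.EmailAddress] = $_.Department }
}

Import-Csv .\departments.csv |
    Where-Object { $previous[$_.EmailAddress] -ne $_.Department } |
    ForEach-Object { Set-User $_.EmailAddress -Department $_.Department }

# Remember what was pushed, so the next run only sends the differences.
Copy-Item .\departments.csv .\departments.last.csv -Force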
EDIT: Start-Automate left a very useful answer; unfortunately, neither of the approaches works. It is my feeling at this point that I won't find a way to speed this up until my provider gives access to more PowerShell cmdlets, or the Exchange cmdlets are modified to allow multiple pipelines, neither of which I expect to happen any time soon. I am marking his answer as correct, despite the ultimate result that nothing helps significantly. Thanks, Start-Automate.
You can speed up your script, and also avoid trying to make two connections to the server, by using the foreach statement instead of ForEach-Object.
$departments = @(Import-Csv .\departments.csv)
foreach ($user in $departments) {
    Set-User $user.EmailAddress -Department $user.Department
}
If you need to batch, you could use the for statement, moving forward in each batch
for ($i = 0; $i -lt $departments.Count; $i += 3) {
    $jobs = @()
    $jobs += Invoke-Command { Set-User $departments[$i].EmailAddress -Department $departments[$i].Department } -AsJob
    $jobs += Invoke-Command { Set-User $departments[$i + 1].EmailAddress -Department $departments[$i + 1].Department } -AsJob
    $jobs += Invoke-Command { Set-User $departments[$i + 2].EmailAddress -Department $departments[$i + 2].Department } -AsJob
    $jobs | Wait-Job | Receive-Job
}
Hope this helps