I need to run parallel Search-Mailbox cmdlets against hundreds of mailboxes to delete their content, but each mailbox first needs to meet certain criteria, like having certain CAS protocols enabled and a forwarding address present. I've also parameterised the script so I can pass a $maxJobCount int to it, letting the runner cap the number of concurrently running jobs to account for the resources on their machine.
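For context, a minimal sketch of that throttling loop ($maxJobCount and _StartJob are the names used in this post; $targetMailboxes and the poll interval are assumptions):
foreach ($mailbox in $targetMailboxes) {
    # Block until the number of running jobs drops below the caller's cap
    while ((Get-Job -State Running).Count -ge $maxJobCount) {
        Start-Sleep -Seconds 2
    }
    _StartJob -mailAddress $mailbox.PrimarySmtpAddress
}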
I got the whole thing working, then got to the Start-Job component, which is a pretty simple function:
function _StartJob {
    param (
        $mailAddress
    )
    Start-Job -Name $mailAddress -ScriptBlock {
        # $using: is required here; the job runs in a separate process, so the caller's variables aren't visible inside the script block
        Get-EXOMailbox $using:mailAddress -PropertySets Delivery
    }
}
That's returning an error saying I need to run Connect-ExchangeOnline before using the cmdlets, which is where I learned that script blocks in Start-Job actually run as new PowerShell.exe processes, so they don't inherit modules or session state.
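A quick way to see this first-hand (assuming the module is loaded in the parent session):
Import-Module ExchangeOnlineManagement
# The job is a fresh powershell.exe process, so the module is not loaded there
Start-Job -ScriptBlock { Get-Module ExchangeOnlineManagement } | Wait-Job | Receive-Job   # returns nothing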
Does anyone know an easier way around this? In an MFA environment it either means sitting there pasting the password in a few hundred times, or convincing the change board and the SecOps department to let me set up a Graph application with delete rights... both painful.
Thanks for any advice
You just have to pass the creds into the block however you want.
$kvCertName = 'Cert'
# I am using Azure Automation here to get the cert; it's different for Key Vault
$kvCertPFX = Get-AutomationCertificate -Name $kvCertName
$tenantid = 'yourcompany.onmicrosoft.com'
$appid = '00000000-46da-6666-5555-33333cfe77ec'
$startDate = ([datetime]::Today).AddDays(-7)
#Build the script block
$block = {
    param(
        $kvCert,
        $appID,
        $tenantID,
        $n,
        $startDate
    )
    # Rebuild the certificate object from the raw bytes passed into the job
    $newCertPFX = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new($kvCert)
    Connect-ExchangeOnline -Certificate $newCertPFX -AppID $appID -Organization $tenantID -ErrorAction Stop
    # Each job pulls one day's slice of the audit log
    Search-AdminAuditLog -StartDate $startDate.AddDays($n) -EndDate ($startDate.AddDays($n) | Get-Date -Hour 23 -Minute 59 -Second 59) -ExternalAccess:$false -ResultSize 250000
    Disconnect-ExchangeOnline -Confirm:$false
}
#Remove all jobs created
Get-Job | Remove-Job
# Run all the parallel jobs
$num = 0..6
$kvCert = $kvCertPFX.Export(3) # 3 = X509ContentType.Pfx; raw bytes survive job serialization
foreach ($n in $num) { Start-Job -ScriptBlock $block -ArgumentList @($kvCert, $appid, $tenantid, $n, $startDate) }
#Wait for all jobs to finish.
do { Start-Sleep -Seconds 1 }
until ((Get-Job -State Running).Count -eq 0)
#Get information from each job.
$adminPowerShellAuditLog = $null
foreach ($job in Get-Job) { $adminPowerShellAuditLog += Receive-Job -Id $job.Id }
Write-Output $adminPowerShellAuditLog
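The detail that makes this work is exporting the certificate to raw PFX bytes before handing it to the jobs: a byte array survives the job serialization boundary intact, whereas a live X509Certificate2 object would not, which is why the script block rebuilds the certificate from $kvCert on its side.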
Related
I have developed a script which does a lot of processing for a front-end tool, and now I am attempting to have the script run with multiple threads. It interacts a lot with SQL databases, but this should not be a problem for multithreading, as the database transactions are very short-lived and the queries well optimised.
What is the issue?
#.\tester.ps1 -servers (1,'Server1',3,1),(2,'Server2',3,1) -output_folder 'C:\temp'
param ([array[]]$servers, $output_folder)
for ($i = 0; $i -lt $servers.Count; $i++)
{
    $myserverid = $servers[$i][0]
    $myservername = $servers[$i][1]
    $mylocationid = $servers[$i][2]
    $myappid = $servers[$i][3]
    Write-Output " $myserverid and $myservername and $mylocationid and $myappid"
    Invoke-Sqlcmd -ServerInstance "$myservername" -Query "select top 10 name from sysobjects" -Database "master"
}
The script above gets passed an array of servers and currently loops through them one by one. One way to make the process faster is to run the script in parallel / with multiple threads.
Research
I have looked at a TechNet script at https://gallery.technet.microsoft.com/scriptcenter/Run-a-PowerShell-script-991c8a42
It's not quite the same, as my array is not just a list of servers; there will be other parameters sent with it.
What am I after
A way or pointer to make the script run in parallel, or an example using the script provided above.
Thanks in advance.
Extending my comment: in PowerShell v5, use Jobs and Workflows for parallel use cases.
# Example using parallel jobs
$start = Get-Date
# get all hotfixes
$task1 = { Get-Hotfix }
# get all scripts in your profile
$task2 = { Get-Service | Where-Object Status -eq Running }
# parse log file
$task3 = { Get-Content -Path $env:windir\windowsupdate.log | Where-Object { $_ -like '*successfully installed*' } }
# run 2 tasks in the background and 1 in the foreground
$job1 = Start-Job -ScriptBlock $task1
$job2 = Start-Job -ScriptBlock $task2
$result3 = Invoke-Command -ScriptBlock $task3
# wait for the remaining tasks to complete (if not done yet)
$null = Wait-Job -Job $job1, $job2
# now they are done, get the results
$result1 = Receive-Job -Job $job1
$result2 = Receive-Job -Job $job2
# discard the jobs
Remove-Job -Job $job1, $job2
$end = Get-Date
# Example using Workflow
workflow Test-WFConnection
{
    param
    (
        [string[]]$Computers
    )
    foreach -parallel ($computer in $Computers)
    {
        Test-Connection -ComputerName $computer -Count 1 -ErrorAction SilentlyContinue
    }
}
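Applied to the question's server array, a minimal job-based sketch might look like the following (the query and tuple layout come from the post; everything else is an assumption, including the unary comma that passes each inner array as a single argument):
$jobs = foreach ($server in $servers) {
    Start-Job -ArgumentList (,$server) -ScriptBlock {
        param($s)
        # Unpack the (serverid, servername, locationid, appid) tuple
        $myserverid, $myservername, $mylocationid, $myappid = $s
        # Requires the SqlServer module to be installed; module auto-loading imports it in each job process
        Invoke-Sqlcmd -ServerInstance $myservername -Query "select top 10 name from sysobjects" -Database master
    }
}
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job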
I am trying to delete every file in a SharePoint list. My org has retention turned on so I can't just delete the entire list, but must remove every folder/file. My issue is around the connection itself when used with Start-Job.
It's painfully slow, so I wanted to spin up batches of 10+ jobs to delete simultaneously and reuse the connection, but there is an issue passing the connection as an argument because it becomes deserialized. I found this post with the exact same issue and no solution.
If I "work around" it by connecting inside each Start-Job, I get throttled by SharePoint Online.
function Empty-PnPFiles($SPSiteUrl, $RelativeURL)
{
    $connection = Connect-PnPOnline -Url $SPSiteUrl -UseWebLogin -ReturnConnection
    # Get all files in the folder
    $Files = Get-PnPFolderItem -FolderSiteRelativeUrl $RelativeURL -ItemType File
    # Delete all files in the folder
    $n = 0
    $Jobs = @()
    ForEach ($File in $Files)
    {
        $n++
        Write-Host "Creating job to delete '$($File.ServerRelativeURL)'"
        # Delete file
        $Jobs += Start-Job -ArgumentList @($connection, $File) -ScriptBlock {
            $LocalConnection = $args[0]
            # $LocalConnection = Connect-PnPOnline -Url <My SP URL> -UseWebLogin -ReturnConnection
            $LocalFile = $args[1]
            Remove-PnPFile -ServerRelativeUrl $LocalFile.ServerRelativeURL -Connection $LocalConnection -Force -Recycle
        }
        # Do in batches: every 10 jobs, wait for completion
        if ($n % 10 -eq 0)
        {
            Write-Host "Waiting for batch $n ($($Files.Count)) to complete before next batch" -ForegroundColor Green
            $Jobs | Wait-Job | Receive-Job
            $Jobs = @()
        }
    }
    # If there are leftover jobs, wait for them
    if ($Jobs)
    {
        $Jobs | Wait-Job | Receive-Job
    }
}
$SiteURL = "https://<MySiteCollection>.sharepoint.com/sites/<MySite>"
$ListName = "TempDelete"
Empty-PnPFiles -SPSiteUrl $SiteURL -RelativeURL "/TempDelete" # <My folder to delete all files>
The error I get is:
Cannot bind parameter 'Connection'. Cannot convert the "PnP.PowerShell.Commands.Base.PnPConnection" value of type "Deserialized.PnP.PowerShell.Commands.Base.PnPConnection" to type "PnP.PowerShell.Commands.Base.PnPConnection".
How can I pass the connection to the script block without the serialization error? Or is there a better way to bulk-delete files from SPO using PowerShell? I have to use PowerShell because it's the only tool available to me currently.
Use Invoke-Command instead of Start-Job
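For what it's worth, a minimal sketch of that suggestion: Invoke-Command without -ComputerName runs the script block in the current session, so the live PnPConnection is passed by reference and never serialized; the trade-off is that the deletes run sequentially rather than in parallel.
ForEach ($File in $Files)
{
    Invoke-Command -ScriptBlock {
        param($Conn, $F)
        # Same deletion as the original job body, but against the in-process connection
        Remove-PnPFile -ServerRelativeUrl $F.ServerRelativeURL -Connection $Conn -Force -Recycle
    } -ArgumentList $connection, $File
}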
I tried to run a few Azure cmdlets in the background (I tried both the -AsJob switch parameter and the Start-Job cmdlet). I push the cmdlet operations into a hashtable under certain keys and then iterate over the running jobs.
The problem is that all the cmdlets change state to "Completed" at the same time, even if some job in the hashtable actually finished earlier.
It feels like all jobs end together with the last-ending job.
Below is some example code showing the problem.
$vmssInstances = Get-AzVmssVM -ResourceGroupName $(agentPoolName) -VMScaleSetName $(agentPoolName)
$groupedVmssInstancesByComputerName = @{}
foreach ($vmssInstance in $vmssInstances)
{
    $groupedVmssInstancesByComputerName[$vmssInstance.OsProfile.ComputerName] = @{ vmmsInstance = $vmssInstance; agents = @(); isFullyUpdated = $false }
}
foreach ($key in $groupedVmssInstancesByComputerName.Keys)
{
    $vmssInstance = $groupedVmssInstancesByComputerName[$key]["vmmsInstance"]
    Write-Host "Trying to reimage instance $($vmssInstance.InstanceId)..."
    $groupedVmssInstancesByComputerName[$key]["reimageOperation"] = Set-AzVmssVM -Reimage -InstanceId $vmssInstance.InstanceId -ResourceGroupName $(agentPoolName) -VMScaleSetName $(agentPoolName) -AsJob
}
while ($true)
{
    Get-Job
    Start-Sleep -Seconds 10
}
I can't understand what is going on.
Maybe I don't know some features of PowerShell.
Please help me, guys!
Try using Wait-Job, which suppresses the command prompt until one or all of the PowerShell background jobs running in the session are completed.
Get-Job | Wait-Job
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.core/wait-job?view=powershell-6
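A slightly fuller sketch of that advice applied to the reimage jobs above (names assumed): block once for everything, then inspect each job individually.
$jobs = Get-Job
$jobs | Wait-Job | Out-Null
foreach ($job in $jobs) {
    # Each job's State and output are final here and can be read per instance
    "{0}: {1}" -f $job.Name, $job.State
    Receive-Job -Job $job
}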
I have used PowerShell for 4 months now and have learned some basic stuff, e.g. how to loop over a CSV, create functions, export data, etc. Now I am trying to get more complex with it and have started building a leavers script for the company I am currently working for. So far so good: I have a working script that imports an initial CSV containing usernames, loops through them, and creates a folder on the target destination. Next comes the mailbox export (where I actually need help, and the purpose of this post). That works too, but the problem is it cycles through everyone in the CSV and pushes Exchange to export them all at once. I don't like that because it over-utilises the server, and I saw larger mailboxes fail, so I decided to try to build a queue where mailboxes are exported one by one.
This is what I am using currently, where it pushes everyone at once:
$dest = "%\Desktop\#SCRIPTS\Modules\Exports\Temp\Target.csv"
$targetStorage = "\\SERVER\Targetfolders"
$Connection = "http://Exchange-Server.companydomain.com/PowerShell/"
Set-ExecutionPolicy RemoteSigned
$UserCredential = Get-Credential
$Session = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri $Connection -Authentication Kerberos -Credential $UserCredential
Import-PSSession $Session
#########################################################################
$mails = Import-Csv $dest
ForEach ($mail in $mails)
{
    New-MailboxExportRequest -Mailbox "$($mail.Name)@CompanyDomain.com" -FilePath "$($targetStorage)\$($mail.Name)\$($mail.Name).pst" | Out-Null
}
And this is something I came up with yesterday:
$dest = "%\Desktop\#SCRIPTS\Modules\Exports\Temp\Target.csv"
$targetStorage = "\\SERVER\Targetfolders"
$Connection = "http://Exchange-Server.companydomain.com/PowerShell/"
$mails = Import-Csv $dest
ForEach ($mail in $mails) {
    Get-MailboxExportRequest -Mailbox $mail.Name
    if (Get-MailboxExportRequest -Status InProgress) {
        Write-Host "Mailboxes are still exporting, do not cancel the script!!"
    }
    else { Get-MailboxExportRequestStatistics -Identity $mail.Name | **Some-Cmdlet** -Like 'PercentComplete' -eq '100' | New-MailboxExportRequest -Mailbox $mail.Name -FilePath "$($synStorage)\$($mail.Name)\$($mail.Name).pst" }
}
In the script above I added "Some-Cmdlet" with -Like 'PercentComplete' -eq '100', where the 100 represents the percentage complete reported by Get-MailboxExportRequestStatistics; the idea is that when the previous export has reached 100%, the next export from the CSV is executed.
Any thoughts on which cmdlet I could use to "tag" the PercentComplete entries?
Name StatusDetail SourceAlias PercentComplete
---- ------------ ----------- ---------------
MailboxExport Completed Dummy.User 100
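For reference, the "Some-Cmdlet" placeholder above would presumably be a standard Where-Object filter over the statistics objects, something like this sketch (not necessarily the final answer):
Get-MailboxExportRequest -Mailbox $mail.Name |
    Get-MailboxExportRequestStatistics |
    Where-Object { $_.PercentComplete -eq 100 }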
Sorry for the long post but I tried to make it as clear as possible :)
Thanks!
==========================
Edit as per Xiakit's advice:
==========================
$mails = Import-Csv $dest
ForEach ($mail in $mails) {
    Get-MailboxExportRequest -Mailbox $mail.Name
    if (Get-MailboxExportRequest -Status InProgress) {
        Write-Host "Mailboxes are still exporting, do not cancel the script!!"
    }
    else {
        New-MailboxExportRequest -Mailbox $mail.Name -FilePath "$($synStorage)\$($mail.Name)\$($mail.Name).pst"
        while ($requeststatus.Status -notlike "Completed") {
            Start-Sleep -Seconds 10
            $RequestStatus = Get-MailboxExportRequest -Identity $mail.Name
        }
    }
}
However, the script ends and never loops :( The output:
Mailboxes are still exporting, do not cancel the script!!
Name Mailbox Status
---- ------- ------
MailboxExport omitted/Enterprise/Users/Vlatko Completed
MailboxExport1 omitted/Enterprise/Users/Vlatko Completed
MailboxExport2 omitted/Enterprise/Users/Vlatko InProgress
MailboxExport omitted/Enterprise/Users/Dummy User Queued
MailboxExport omitted/Enterprise/Users/Dummy User Queued
MailboxExport omitted/Enterprise/Users/Dummy User Queued
Mailboxes are still exporting, do not cancel the script!!
There is also a Get-MailboxExportRequest; you can use it to check whether a request is completed:
while ($requestStatus.Status -notlike "Completed") {
    Start-Sleep -Seconds 10
    $requestStatus = Get-MailboxExportRequest -Identity "whatever"
}
This will loop as long as your request is not completed.
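A do/while variant of the same idea avoids testing $requestStatus before it has been assigned on the first pass (a small sketch along the same lines):
do {
    Start-Sleep -Seconds 10
    $requestStatus = Get-MailboxExportRequest -Identity "whatever"
} while ($requestStatus.Status -notlike "Completed")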
To cut the saga short: in the end I asked on TechNet for help as well, since of course they are Microsoft and should know the answer to every PowerShell problem :) Big thanks to Xiakit for the effort to resolve my problem (which I am still keen on working out, since I started it and I want to finish it). However, here is the solution for queuing mailbox exports:
$mails = Import-Csv $dest
ForEach ($mail in $mails) {
    New-MailboxExportRequest -Mailbox $mail.Name -FilePath "$($synStorage)\$($mail.Name)\$($mail.Name).pst"
    do {
        Write-Host "Queued"
        Start-Sleep -s 5
    } while (Get-MailboxExportRequest -Mailbox $mail.Name -Status Queued)
    do {
        Write-Host "InProgress"
        Start-Sleep -s 5
    } while (Get-MailboxExportRequest -Mailbox $mail.Name -Status InProgress)
    If (Get-MailboxExportRequest -Mailbox $mail.Name -Status Completed) {
        Write-Host "An export request completed"
        # Scope the cleanup to this mailbox so other requests are left alone
        Get-MailboxExportRequest -Mailbox $mail.Name | Remove-MailboxExportRequest -Confirm:$false
    }
    If (Get-MailboxExportRequest -Mailbox $mail.Name -Status Failed) {
        Write-Host "An error occurred"
    }
}
Hope this will come in handy to other Admins trying to automate leaver processes in their company for multiple users!!
Thanks for the support!
Vlatko
I am trying to generate 1000 Azure VMs (yes, I know the cost) for a large job. This is being done asynchronously in a PowerShell script (see below), so charges will not be incurred while waiting for all the VMs to spin up.
In the script, if Wait-Job and Receive-Job are included when the script runs, all the requested VMs are created; but if Wait-Job and Receive-Job are commented out, not all the VMs are created. It seems random which VMs get created.
Can anyone see what I'm doing wrong with this script?
$VMCount = 5
$Location = 'East US'
$Image = 'MyImage'
$AdminPassword = 'XXXXXXXXXX'
$LinuxUser = 'MyUser'
$InstanceSize = 'ExtraSmall' #extra small only for testing
$CloudServiceName = 'NewAzureMachinePrefix' #this is changed each time to something unique
for ($i = 1; $i -le $VMCount; $i++)
{
    # Note: the variable actually holds a Job object, not an ID
    $jobId = Start-Job -ArgumentList "$CloudServiceName$i", $Location, $Image, $AdminPassword, $LinuxUser, $InstanceSize -ScriptBlock {
        param($ServiceName, $Loc, $Img, $Password, $User, $Size)
        New-AzureVMConfig -Name $ServiceName -InstanceSize $Size -ImageName $Img |
            Add-AzureProvisioningConfig -Linux -LinuxUser $User -Password $Password |
            Add-AzureDataDisk -CreateNew -DiskSizeInGB 50 -DiskLabel $ServiceName -LUN 0 |
            Remove-AzureEndpoint 'SSH' |
            Add-AzureEndpoint -Protocol tcp -PublicPort 22 -LocalPort 22 -Name 'SSH' |
            Set-AzureSubnet 'MySubnet' |
            New-AzureVM -ServiceName $ServiceName -AffinityGroup 'MyGroup' -VNetName 'MyNet'
    }
    Write-Output $CloudServiceName$i
    Wait-Job $jobId
    Receive-Job $jobId
}
I figured out what is going on after a few emails with our Microsoft rep. When creating a new virtual machine in Azure using an Affinity Group and/or Virtual Network, an exclusive lock is taken. This exclusive lock does not allow more than one request to access the Affinity Group and/or Virtual Network at a time.
Have you tried moving the Wait-Job outside of the for loop? I'm guessing you know that by putting it there you are making the jobs run synchronously.
The following will wait for all jobs:
Get-Job | Wait-Job
Get-Job | Receive-Job
The Receive-Job should give you some clues about why some are not being created.
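Concretely, a minimal sketch of that reshuffle, assuming $vmBlock holds the provisioning script block from the question: start every job first, then wait and receive once.
for ($i = 1; $i -le $VMCount; $i++)
{
    # Fire-and-forget: no Wait-Job inside the loop, so the jobs overlap
    Start-Job -Name "$CloudServiceName$i" -ArgumentList "$CloudServiceName$i", $Location, $Image, $AdminPassword, $LinuxUser, $InstanceSize -ScriptBlock $vmBlock
}
Get-Job | Wait-Job
Get-Job | Receive-Job   # errors surfaced here should explain any VMs that failed to create
Get-Job | Remove-Job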