I am trying to create 1,000 Azure VMs (yes, I know the cost) for a large job. This is done asynchronously in a PowerShell script (see below), so charges will not be incurred while waiting for all the VMs to spin up.
In the script, if Wait-Job and Receive-Job are included, all the requested VMs are created when the script runs; but if Wait-Job and Receive-Job are commented out, not all of the VMs are created, and it seems random which VMs do get created.
Can anyone see what I'm doing wrong with this script?
$VMCount = 5
$Location = 'East US'
$Image = 'MyImage'
$AdminPassword = 'XXXXXXXXXX'
$LinuxUser = 'MyUser'
$InstanceSize = 'ExtraSmall' #extra small only for testing
$CloudServiceName = 'NewAzureMachinePrefix' #this is changed each time to something unique
for ($i = 1; $i -le $VMCount; $i++)
{
    $jobId = Start-Job -ArgumentList $CloudServiceName$i, $Location, $Image, $AdminPassword, $LinuxUser, $InstanceSize -ScriptBlock {
        param($ServiceName, $Loc, $Img, $Password, $User, $Size)
        New-AzureVMConfig -Name $ServiceName -InstanceSize $Size -ImageName $Img |
            Add-AzureProvisioningConfig -Linux -LinuxUser $User -Password $Password |
            Add-AzureDataDisk -CreateNew -DiskSizeInGB 50 -DiskLabel $ServiceName -LUN 0 |
            Remove-AzureEndpoint 'SSH' |
            Add-AzureEndpoint -Protocol tcp -PublicPort 22 -LocalPort 22 -Name 'SSH' |
            Set-AzureSubnet 'MySubnet' |
            New-AzureVM -ServiceName $ServiceName -AffinityGroup 'MyGroup' -VNetName 'MyNet'
    }
    Write-Output $CloudServiceName$i
    Wait-Job $jobId
    Receive-Job $jobId
}
I figured out what is going on after a few emails from our Microsoft rep. When creating a new virtual machine in Azure using an Affinity Group and/or Virtual Network, an exclusive lock is taken. This exclusive lock does not allow more than one request at a time to access the Affinity Group and/or Virtual Network, which is why firing off many creation requests at once causes some of them to fail.
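Given that, one workaround is to keep the jobs asynchronous but retry New-AzureVM with a back-off when the lock rejects a request. A rough, untested sketch ($vmConfig stands for the object built by the New-AzureVMConfig pipeline, and the assumption that a lock conflict surfaces as a terminating error is mine):
$maxRetries = 5
for ($attempt = 1; $attempt -le $maxRetries; $attempt++) {
    try {
        $vmConfig | New-AzureVM -ServiceName $ServiceName -AffinityGroup 'MyGroup' -VNetName 'MyNet' -ErrorAction Stop
        break   # created successfully, stop retrying
    }
    catch {
        if ($attempt -eq $maxRetries) { throw }                    # give up after the last attempt
        Start-Sleep -Seconds (Get-Random -Minimum 5 -Maximum 30)   # back off so the jobs don't hammer the lock in step
    }
}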
Have you tried moving the Wait-Job outside of the for loop? I'm guessing you know that by putting it inside the loop you are making the script run synchronously.
The following will wait for all jobs:
Get-Job | Wait-Job
Get-Job | Receive-Job
The Receive-Job should give you some clues about why some are not being created.
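If some jobs fail silently, merging their error stream into the output can surface the reason. A small sketch (-Keep is used so the results stay on the jobs and can be read again later):
Get-Job | Wait-Job | Out-Null
Get-Job | ForEach-Object {
    "--- Job $($_.Id) ($($_.State)) ---"
    Receive-Job $_ -Keep 2>&1   # 2>&1 folds error records into the normal output
}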
Related
I need to run parallel Search-Mailbox cmdlets against hundreds of mailboxes to delete their content, but the mailboxes first need to match certain parameters, like having certain CAS protocols enabled and a forwarding address present. I've also parameterised the script so the runner can pass a $maxJobCount int specifying the maximum number of concurrently running jobs, to account for the resources on their machine.
I got that working, then got to the Start-Job component, which is a pretty simple function.
function _StartJob {
    param (
        $mailAddress
    )
    Start-Job -Name $mailAddress -Scriptblock {
        Get-EXOMailbox $mailAddress -PropertySets Delivery
    }
}
That returns an error saying I need to run Connect-ExchangeOnline before using the cmdlets, which is where I learned that script blocks passed to Start-Job actually run in new PowerShell.exe processes, so they don't inherit modules or session options (they don't inherit the caller's variables either, so the $mailAddress reference above would also come back empty).
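A quick way to see the variable side of this, plus the two standard fixes, on PowerShell 3.0+ (this only gets values across the process boundary; it does not import the Exchange module or session):
$mailAddress = 'user@contoso.com'
# The caller's variable is NOT visible in the job's process:
Start-Job { "value: '$mailAddress'" } | Wait-Job | Receive-Job        # value: ''
# Option 1: pass it explicitly
Start-Job -ArgumentList $mailAddress -ScriptBlock {
    param($addr)
    "value: '$addr'"
} | Wait-Job | Receive-Job
# Option 2: the $using: scope modifier
Start-Job { "value: '$using:mailAddress'" } | Wait-Job | Receive-Job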
Does anyone know an easier way around the authentication part? In an MFA environment, it either means sitting there pasting the password in a few hundred times, or convincing the change board and SecOps department to let me set up a Graph application with delete rights... both painful.
Thanks for any advice
You just have to pass the creds into the script block, however you want. Here I use certificate-based app authentication, so no interactive MFA prompt is needed:
$kvCertName = 'Cert'
# I am using Azure Automation here to get the cert; it's different for Key Vault
$kvCertPFX = Get-AutomationCertificate -Name $kvCertName
$tenantid = 'yourcompany.onmicrosoft.com'
$appid = '00000000-46da-6666-5555-33333cfe77ec'
$startDate = ([datetime]::Today).AddDays(-7)
# Build the script block
$block = {
    Param(
        $kvCert,
        $appID,
        $tenantID,
        $n,
        $startdate
    )
    $newCertPFX = [System.Security.Cryptography.X509Certificates.X509Certificate2]::new($kvCert)
    Connect-ExchangeOnline -Certificate $newCertPFX -AppID $appID -Organization $tenantID -ErrorAction Stop
    Search-AdminAuditLog -StartDate $startDate.AddDays($n) -EndDate $($startDate.AddDays($n) | Get-Date -Hour 23 -Minute 59 -Second 59) -ExternalAccess:$false -ResultSize 250000
    Disconnect-ExchangeOnline -Confirm:$false
}
# Remove all jobs created previously
Get-Job | Remove-Job
# Run all the parallel jobs (one per day of the last week)
$num = 0..6
$kvCert = $kvCertPFX.Export(3)   # 3 = Pfx; export as a byte array so the cert can cross the process boundary
foreach ($n in $num) { Start-Job -Scriptblock $block -ArgumentList @($kvCert, $appID, $tenantid, $n, $startdate) }
# Wait for all jobs to finish.
do { Start-Sleep 1 }
until ((Get-Job -State Running).Count -eq 0)
# Get information from each job.
$adminPowerShellAuditLog = $null
foreach ($job in Get-Job) { $adminPowerShellAuditLog += Receive-Job -Id $job.Id }
Write-Output $adminPowerShellAuditLog
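Since you mentioned a $maxJobCount cap, one common throttle pattern is to block before each Start-Job until a slot frees up. A hedged sketch ($mailboxes is a hypothetical list, and $block would need adapting to take a mailbox instead of a day offset):
$maxJobCount = 5
foreach ($mailbox in $mailboxes) {
    # Block until the number of running jobs drops below the cap
    while ((Get-Job -State Running).Count -ge $maxJobCount) {
        Start-Sleep -Seconds 1
    }
    Start-Job -Name $mailbox -ScriptBlock $block -ArgumentList @($kvCert, $appID, $tenantid, $mailbox)
}
Get-Job | Wait-Job | Receive-Job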
I have developed a script which does a lot of processing for a front-end tool, and now I am attempting to make the script run with multiple threads. It interacts a lot with SQL databases; this should not be a problem for multithreading, as the database transactions are very short-lived and the queries well optimised.
What is the issue?
#.\tester.ps1 -servers (1,'Server1',3,1),(2,'Server2',3,1) -output_folder 'C:\temp'
param ([array[]]$servers, $output_folder)
for ($i = 0; $i -lt $servers.Count; $i++)
{
    $myserverid   = $servers[$i][0]
    $myservername = $servers[$i][1]
    $mylocationid = $servers[$i][2]
    $myappid      = $servers[$i][3]
    Write-Output " $myserverid and $myservername and $mylocationid and $myappid"
    Invoke-Sqlcmd -ServerInstance "$myservername" -Query "select top 10 name from sysobjects" -Database "master"
}
The script above gets passed an array of servers and currently loops through the array one by one. A way to make the process faster is to run the script in parallel / with multiple threads.
Research
I have looked at a technet script on https://gallery.technet.microsoft.com/scriptcenter/Run-a-PowerShell-script-991c8a42
It's not quite the same, as my array is not just a list of servers; other parameters are sent with each entry.
What I am after
A way or pointer to make the script be able to run in parallel or an example using the provided script above.
Thanks in advance.
Extending my comment: in PowerShell v5, use jobs and workflows for parallel use cases.
# Example using parallel jobs
$start = Get-Date
# get all hotfixes
$task1 = { Get-Hotfix }
# get all running services
$task2 = { Get-Service | Where-Object Status -eq Running }
# parse log file
$task3 = { Get-Content -Path $env:windir\windowsupdate.log | Where-Object { $_ -like '*successfully installed*' } }
# run 2 tasks in the background, and 1 in the foreground
$job1 = Start-Job -ScriptBlock $task1
$job2 = Start-Job -ScriptBlock $task2
$result3 = Invoke-Command -ScriptBlock $task3
# wait for the remaining tasks to complete (if not done yet)
$null = Wait-Job -Job $job1, $job2
# now they are done, get the results
$result1 = Receive-Job -Job $job1
$result2 = Receive-Job -Job $job2
# discard the jobs
Remove-Job -Job $job1, $job2
$end = Get-Date
# Example, using WorkFlow
workflow Test-WFConnection
{
    param
    (
        [string[]]$Computers
    )
    foreach -parallel ($computer in $computers)
    {
        Test-Connection -ComputerName $computer -Count 1 -ErrorAction SilentlyContinue
    }
}
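Adapting the same pattern to the server-array question above might look like the following sketch (untested; it assumes the SqlServer/SQLPS module is available inside the InlineScript, and Invoke-ServerQuery is a name I made up):
workflow Invoke-ServerQuery {
    param ([array[]]$Servers)
    foreach -parallel ($server in $Servers)
    {
        InlineScript {
            # Each element is (id, name, locationId, appId), matching the original script
            $s = $using:server
            Invoke-Sqlcmd -ServerInstance $s[1] -Database master -Query 'select top 10 name from sysobjects'
        }
    }
}
Invoke-ServerQuery -Servers (1,'Server1',3,1),(2,'Server2',3,1)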
Been trying to solve this for a bit and can't seem to figure it out.
I have the following script:
$Servers = Get-Content -Path "C:\Utilities_PowerShell\ServerList.txt"
$IISServiceName1 = 'W3SVC'
$IISServiceName2 = 'IISAdmin'
$IISServiceName3 = 'WAS'
$IISarrService = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3
$IISarrServiceCheck = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 -ErrorAction SilentlyContinue -ErrorVariable NoService
function IISServiceStatus # Checks for status of IIS services
{
    param (
        $IISServiceName1,
        $IISServiceName2,
        $IISServiceName3,
        $IISarrService,
        $IISarrServiceCheck
    )
    if (Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3)
    {
        Write-Host "Status of IIS service(s) on $env:ComputerName :"
        Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 | Select Name,DisplayName,Status | Format-Table -AutoSize
    }
    else
    {
        Write-Host " No IIS service(s) were found..." -foreground "red"
    }
}
$Sessions = New-PSSession -ComputerName $Servers
$EndJobs = $Sessions | ForEach-Object {
    Invoke-Command -Session $_ -ScriptBlock ${function:IISServiceStatus} -AsJob -ArgumentList $IISServiceName1, $IISServiceName2, $IISServiceName3, $IISarrService, $IISarrServiceCheck | Wait-Job | Receive-Job
    Write-Host " "
}
Whenever I run it, all I get is the output of:
Status of IIS service(s) on *PC* :
If I run the function outside of a loop/invoke-command, the results are absolutely perfect. What is wrong with my remote loop?
I've tried putting the variables inside the function, I've tried running invoke-command without the argument list, etc.
Update: 3/17/16
Turns out... if I run my actual script as-is, the result of $EndJobs is weird in that it outputs ALL services in one table and then the three IIS services in another table. This would explain why, when I ran my Invoke-Command (stopIIS) script block, I had to reboot the whole server: it took all of the services down.
These functions run PERFECTLY when not run via remote/invoke-command.
What the heck...invoke-command is seriously screwing with my stuff!
Anyone have any ideas/tips on how I can run my local script (which works 100%) against a set of servers from a text file without weird issues like this? Is Invoke-Command the only way?
Do you have the same problem if you wrap it all into the script block, like this?
$Servers = Get-Content 'C:\Utilities_PowerShell\ServerList.txt'
$Sessions = New-PSSession -ComputerName $Servers
$EndJobs = $Sessions | ForEach-Object {
    Invoke-Command -Session $_ -ScriptBlock {
        $IISServiceName1 = 'W3SVC'
        $IISServiceName2 = 'IISAdmin'
        $IISServiceName3 = 'WAS'
        $IISarrService = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3
        $IISarrServiceCheck = Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 -ErrorAction SilentlyContinue -ErrorVariable NoService
        function IISServiceStatus { # Checks for status of IIS services
            param (
                $IISServiceName1,
                $IISServiceName2,
                $IISServiceName3,
                $IISarrService,
                $IISarrServiceCheck
            )
            if (Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3) {
                Write-Host "Status of IIS service(s) on $env:ComputerName :"
                Get-Service -Name $IISServiceName1,$IISServiceName2,$IISServiceName3 | Select Name,DisplayName,Status | Format-Table -AutoSize
            } else {
                Write-Host ' No IIS service(s) were found...' -ForegroundColor Red
            }
        }
        IISServiceStatus $IISServiceName1 $IISServiceName2 $IISServiceName3 $IISarrService $IISarrServiceCheck
    } -AsJob | Wait-Job | Receive-Job
    Write-Host ' '
}
$EndJobs
I'm having a similar issue. I'm using CredSSP to test second-hop authentication for an automation that shuts down a production environment cleanly. My script has 3 sections: session setup, the invoke, and session teardown. If I run each piece separately, I get output. If I run the whole script, I get blank lines matching the amount of output I get when I run them separately... There's nothing fancy in my invoke (the backtick is line continuation; I prefer Python's formatting paradigm to PowerShell/C#'s):
Invoke-Command `
-Session $workingSession `
-ScriptBlock {
get-service *spool* -ComputerName server01
}
Overview
I'm looking to call a PowerShell script that takes an argument, runs each job in the background, and shows me the verbose output.
Problem I am running into
The script appears to run, but I want to verify this for sure by streaming the results of the background jobs as they are running.
Code
###StartServerUpdates.ps1 Script###
#get list of servers to update from text file and store in array
$servers=get-content c:\serverstoupdate.txt
#run all jobs, using multi-threading, in background
ForEach ($server in $servers) {
    Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server
}
#Wait for all jobs
Get-Job | Wait-Job
#Get all job results
Get-Job | Receive-Job
What I am currently seeing:
Id Name State HasMoreData Location Command
-- ---- ----- ----------- -------- -------
23 Job23 Running True localhost #patch server ...
25 Job25 Running True localhost #patch server ...
What I want to see:
Searching for approved updates ...
Update Found: Security Update for Windows Server 2003 (KB2807986)
Update Found: Windows Malicious Software Removal Tool - March 2013 (KB890830)
Download complete. Installing updates ...
The system must be rebooted to complete installation.
cscript exited on "myServer" with error code 3.
Reboot required...
Waiting for server to reboot (35)
Searching for approved updates ...
There are no updates to install.
cscript exited on "myServer" with error code 2.
Servername "myServer" is fully patched after 2 loops
I want to be able to see the output or store that somewhere so I can refer back to be sure the script ran and see which servers rebooted, etc.
Conclusion:
In the past, I ran the script and it went through updating the servers one at a time and gave me the output I wanted, but when I started doing more servers - this task took too long, which is why I am trying to use background jobs with "Start-Job".
Can anyone help me figure this out, please?
You may take a look at the module SplitPipeline.
It is specifically designed for tasks like this. Working demo code:
# import the module (not necessary in PS V3)
Import-Module SplitPipeline
# some servers (from 1 to 10 for the test)
$servers = 1..10
# process servers by parallel pipelines and output results immediately
$servers | Split-Pipeline {process{"processing server $_"; sleep 1}} -Load 1, 1
For your task, replace "processing server $_"; sleep 1 (which simulates a slow job) with a call to your script, using the variable $_ (the current server) as input.
If each job is not processor-intensive, then increase the parameter Count (the default is the processor count) in order to improve performance.
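Applied to the patching task above, a hedged sketch might look like this (up to 10 parallel pipelines; the -Count value is an arbitrary choice):
Get-Content c:\serverstoupdate.txt |
    Split-Pipeline -Count 10 {process{ & c:\cefcu_it\psscripts\PSPatch.ps1 $_ }}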
Not a new question, but I feel it is missing an answer that covers PowerShell workflows and their parallel possibilities, available from PowerShell version 3. It is less code and maybe more understandable than starting and waiting for jobs, which of course works well too.
I have two files: TheScript.ps1 which coordinates the servers and BackgroundJob.ps1 which does some kind of check. They need to be in the same directory.
The output from BackgroundJob.ps1 is written to the same stream you see when starting TheScript.ps1.
TheScript.ps1:
workflow parallelCheckServer {
    param ($Servers)
    foreach -parallel ($Server in $Servers)
    {
        Invoke-Expression -Command ".\BackgroundJob.ps1 -Server $Server"
    }
}
parallelCheckServer -Servers @("host1.com", "host2.com", "host3.com")
Write-Output "Done with all servers."
BackgroundJob.ps1 (for example):
param (
    [Parameter(Mandatory=$true)] [string] $server
)
Write-Host "[$server]`t Processing server $server"
Start-Sleep -Seconds 5
So when you start TheScript.ps1 it will write "Processing server" 3 times, but it will not take 15 seconds, only about 5, because the jobs are run in parallel.
[host3.com] Processing server host3.com
[host2.com] Processing server host2.com
[host1.com] Processing server host1.com
Done with all servers.
In your ForEach loop you'll want to grab the output generated by the Jobs already running.
Example Not Tested
$sb = {
    "Starting Job on $($args[0])"
    # Do something
    "$($args[0]) => Do something completed successfully"
    "$($args[0]) => Now for something completely different"
    "Ending Job on $($args[0])"
}
Foreach ($computer in $computers) {
    Start-Job -ScriptBlock $sb -Args $computer | Out-Null
    Get-Job | Receive-Job
}
Now if you do this, all your results will be mixed together. You might want to put a stamp on your verbose output to tell which job each line came from.
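For example, a timestamp-plus-target prefix on every line (a minimal sketch of the same $sb):
$sb = {
    # Stamp each output line with the time and the job's target
    "{0:HH:mm:ss} [{1}] Starting job" -f (Get-Date), $args[0]
    # ...do something...
    "{0:HH:mm:ss} [{1}] Ending job" -f (Get-Date), $args[0]
}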
Or
Foreach ($computer in $computers) {
    Start-Job -ScriptBlock $sb -Args $computer | Out-Null
    Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
}
while ((Get-Job -State Running).Count) {
    Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
    Start-Sleep -Seconds 1
}
It will show all the output as soon as a job is finished. Without being mixed up.
If you want multiple jobs in progress, you'll probably want to massage the output to keep straight, on the console, which output goes with which job.
$BGList = 'Black','Green','DarkBlue','DarkCyan','Red','DarkGreen'
$JobHash = @{}; $ColorHash = @{}; $i = 0
ForEach ($server in $servers)
{
    Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server |
        foreach {
            $ColorHash[$_.ID] = $BGList[$i++]
            $JobHash[$_.ID] = $Server
        }
}
While ((Get-Job).State -match 'Running')
{
    foreach ($Job in Get-Job | where {$_.HasMoreData})
    {
        [System.Console]::BackgroundColor = $ColorHash[$Job.ID]
        Write-Host $JobHash[$Job.ID] -ForegroundColor Black -BackgroundColor White
        Receive-Job $Job
    }
    Start-Sleep -Seconds 5
}
[System.Console]::BackgroundColor = 'Black'
You can get the results by doing something like this after all the jobs have completed:
$array = @()
Get-Job -Name * | ForEach-Object { $array += $_.ChildJobs.Output }
.ChildJobs.Output will have anything that was returned by each job.
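For example, to pair each child job's saved output with where it ran (a small sketch):
foreach ($job in Get-Job) {
    foreach ($child in $job.ChildJobs) {
        "[{0}] {1}" -f $child.Location, ($child.Output -join '; ')
    }
}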
function OutputJoblogs {
    [CmdletBinding(DefaultParameterSetName='Name')]
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [System.Management.Automation.Job] $job,
        [Parameter(Mandatory=$true, Position=1)]
        [string] $logFolder,
        [Parameter(Mandatory=$true, Position=2)]
        [string] $logTimeStamp
    )
    # Output all logs while the job is running
    while ($job.State -eq "Running" -or $job.HasMoreData) {
        Start-Sleep -Seconds 1
        foreach ($remotejob in $job.ChildJobs) {
            if ($remotejob.HasMoreData) {
                $output = (Receive-Job $remotejob)
                if ($output) {
                    $remotejob.Location + ": " + (($output) | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp." + $remotejob.Location + ".txt"))
                }
            }
        }
    }
    # Output errors
    foreach ($remotejob in $job.ChildJobs) {
        if ($remotejob.Error.Count -gt 0) { $remotejob.Location + ": " }
        foreach ($myerr in $remotejob.Error) {
            $myerr 2>&1 | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp." + $remotejob.Location + ".ERROR.txt")
        }
        if ($remotejob.JobStateInfo.Reason.ErrorRecord.Count -gt 0) { $remotejob.Location + ": " }
        foreach ($myerr in $remotejob.JobStateInfo.Reason.ErrorRecord) {
            $myerr 2>&1 | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp." + $remotejob.Location + ".ERROR.txt")
        }
    }
}
# example of usage
$logfileDate = "$((Get-Date).ToString('yyyy-MM-dd-HH.mm.ss'))"
$job = Invoke-Command -ComputerName "servername1","servername2" -ScriptBlock {
    for ($i = 1; $i -le 5; $i++) {
        "$i`n"
        if ($i -gt 2) {
            Write-Error "Bad thing happened"
        }
        if ($i -eq 4) {
            throw "Super Bad thing happened"
        }
        Start-Sleep -Seconds 1
    }
} -AsJob
OutputJoblogs -Job $job -logFolder "$PSScriptRoot\logs" -logTimeStamp $logfileDate
I run a script which performs many WMI queries, but the cmdlet hangs if the server doesn't answer.
Is there any way I can make this (or any other cmdlet, for that matter) time out and exit once X seconds have passed?
Edit
Thanks to a tip from mjolinor, the solution is to run the query with -AsJob and enforce a timeout in a while loop. But this is run from within a job already (started with Start-Job), so how do I know I am controlling the correct job?
This is my code from inside my already started job:
Get-WmiObject Win32_Service -ComputerName $server -AsJob
$Complete = Get-Date
While (Get-Job -State Running){
    If ($(New-TimeSpan $Complete $(Get-Date)).totalseconds -ge 5) {
        echo "five seconds has passed, removing"
        Get-Job | Remove-Job -Force
    }
    echo "still running"
    Start-Sleep -Seconds 3
}
PS: My jobs started with Start-Job are already taken care of.
You could try the Get-WmiCustom function, posted here. Wouldn't it be nice if Get-WmiObject had a timeout parameter? Let's upvote this thing.
I've modified Daniel Muscetta's Get-WmiCustom to also support passing credentials.
I know this post is a little old, hopefully this helps someone else.
# Define modified custom get-wmiobject for timeout with credential from http://blogs.msdn.com/b/dmuscett/archive/2009/05/27/get_2d00_wmicustom.aspx
Function Get-WmiCustom([string]$Class, [string]$ComputerName, [string]$Namespace = "root\cimv2", [int]$Timeout = 15, [pscredential]$Credential)
{
    $ConnectionOptions = New-Object System.Management.ConnectionOptions
    $EnumerationOptions = New-Object System.Management.EnumerationOptions
    if ($Credential) {
        $ConnectionOptions.Username = $Credential.UserName
        $ConnectionOptions.SecurePassword = $Credential.Password
    }
    $timeoutseconds = New-TimeSpan -Seconds $Timeout
    $EnumerationOptions.set_timeout($timeoutseconds)
    $assembledpath = "\\$ComputerName\$Namespace"
    #write-host $assembledpath -foregroundcolor yellow
    $Scope = New-Object System.Management.ManagementScope $assembledpath, $ConnectionOptions
    $Scope.Connect()
    $querystring = "SELECT * FROM " + $Class
    #write-host $querystring
    $query = New-Object System.Management.ObjectQuery $querystring
    $searcher = New-Object System.Management.ManagementObjectSearcher
    $searcher.set_options($EnumerationOptions)
    $searcher.Query = $querystring
    $searcher.Scope = $Scope
    trap { $_ } $result = $searcher.Get()
    return $result
}
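A hypothetical usage example (the server name and the decision to prompt for credentials are placeholders):
$cred = Get-Credential    # account to query with; omit -Credential to use the current one
$services = Get-WmiCustom -Class Win32_Service -ComputerName 'server01' -Timeout 5 -Credential $cred
$services | Select-Object Name, State, StartMode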
Glad my Get-WmiCustom function here http://blogs.msdn.com/b/dmuscett/archive/2009/05/27/get_2d00_wmicustom.aspx is useful.
When creating the job using Get-WmiObject, assign that job to a variable; that variable can then be piped into Get-Job for status, or Receive-Job for results.
$ThisJob = Start-Job -ScriptBlock {param ($Target) Get-WmiObject -Class Win32_Service -ComputerName $Target -AsJob} -ArgumentList $server
$Timer = [System.Diagnostics.Stopwatch]::StartNew()
While ($ThisJob | Get-Job | where {$_.State -imatch "Running"}) {
    If ($Timer.Elapsed.TotalSeconds -ge 5) {
        echo "five seconds have passed, removing"
        $ThisJob | Get-Job | Remove-Job -Force
    } # end if
    echo "still running"
    Start-Sleep -Seconds 3
} # end while
$Results = $ThisJob | where {$_.State -inotmatch "failed"} | Receive-Job
$Timer.Stop()
The only two solutions I've seen for this problem are:
Run the queries as background jobs and put a timer on them, then stop/remove the jobs that run too long.
Fix your servers.
In addition to what has been said: not a bulletproof solution, but consider pinging your servers first (Test-Connection); it can speed up execution time considerably when some machines aren't responding.
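A hedged sketch of that pre-filter (the source file path is a placeholder):
$servers = Get-Content 'C:\servers.txt'
$reachable = $servers | Where-Object { Test-Connection -ComputerName $_ -Count 1 -Quiet }
foreach ($server in $reachable) {
    # ...run the WMI query / start the timed job for $server...
}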