If the Pattern "Idle" is found at the immediate run of the script - it successfully sends the email. The problem is it should be able to keep looking within the while($true) loop with the start-sleep interval. This is not happening - it will wait for 60 minutes and then exit - even when the pattern "Idle" was written as the last line.
do I need a while loop within the Start-Job? I tried this code using a Wait with no luck: Get-Content $file -Tail 1 -Wait | Select-string -Pattern "Idle" -Quiet
$job = Start-Job {
# Note: $file should be the absolute path of your file
Get-Content $File -Raw | Select-string -Pattern "Idle" -Quiet
}
while($true)
{
# if the job has completed
if($job.State -eq 'Completed')
{
$result = $job|Receive-Job
# if result is True
if($result)
{
$elapsedTime.Stop()
$duration = $elapsedTime.Elapsed.ToString("hh\:mm\:ss")
# .... send email logic here
# for success result
break #=> This is important, don't remove it
}
# we don't need an else here;
# if we get here, it's because $result is false
$elapsedTime.Stop()
$duration = $elapsedTime.Elapsed.ToString("hh\:mm\:ss")
# .... send email logic here
# for unsuccessful result
break #=> This is important, don't remove it
}
# if this is running for more than
# 60 minutes break the loop
if($elapsedTime.Elapsed.TotalMinutes -ge 60)
{
$elapsedTime.Stop()
$duration = $elapsedTime.Elapsed.ToString("hh\:mm\:ss")
# .... send email logic here
# for script running longer
# than 60 minutes
break #=> This is important, don't remove it
}
Start-Sleep -Milliseconds 500
}
Get-Job|Remove-Job
You indeed need Get-Content's -Wait switch to keep checking a file for new content in (near) real time (new content is checked for once every second).
However, doing so waits indefinitely, and only ends if the target file is deleted, moved, or renamed.
Therefore, with -Wait applied, your job may never reach status 'Completed' - but there's no need to wait for that, given that Receive-Job can receive job output while the job is running, as it becomes available.
However, you mustn't use Select-String's -Quiet switch, because it will only ever output one result, namely $true once the first match is found - and will produce no further output even if content added later also matches.
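To see the difference (hypothetical sample strings, not taken from your log file):
# -Quiet collapses everything into a single $true, emitted once the first match is found:
'Idle', 'still Idle' | Select-String -Pattern 'Idle' -Quiet   # -> $true (once)
# Without -Quiet, every matching line produces a MatchInfo object:
'Idle', 'still Idle' | Select-String -Pattern 'Idle'          # -> 2 MatchInfo objects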
Therefore, you probably want something like the following:
$job = Start-Job {
# Use Get-Content with -Wait, but don't use Select-String with -Quiet
Get-Content $File -Raw -Wait | Select-string -Pattern "Idle"
}
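# Note: $elapsedTime is assumed to be a previously started stopwatch, e.g.:
#   $elapsedTime = [System.Diagnostics.Stopwatch]::StartNew()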
while ($true)
{
# Check for available job output, if any.
if ($result = $job | Receive-Job) {
$duration = $elapsedTime.Elapsed.ToString("hh\:mm\:ss")
# .... send email logic here
# for success result
break
}
# if this is running for more than
# 60 minutes break the loop
if($elapsedTime.Elapsed.TotalMinutes -ge 60)
{
$elapsedTime.Stop()
$duration = $elapsedTime.Elapsed.ToString("hh\:mm\:ss")
# .... send email logic here
# for script running longer
# than 60 minutes
# Forcefully remove the background job.
$job | Remove-Job -Force
break
}
Start-Sleep -Milliseconds 500
}
Note:
$job | Receive-Job either produces no output, if none happens to be available, or one or more [Microsoft.PowerShell.Commands.MatchInfo] instances reported by Select-String.
Using this command as an if-statement conditional (combined with an assignment to $result) means that one or more [Microsoft.PowerShell.Commands.MatchInfo] instances make the conditional evaluate to $true, based on PowerShell's implicit to-Boolean coercion logic - see the bottom section of this answer.
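For instance, the coercion can be illustrated like this (hypothetical sample strings, not part of your script):
# No match -> no output -> conditional is $false:
if ($result = 'nothing to see' | Select-String -Pattern 'Idle') { 'send email' } else { 'keep waiting' }  # -> 'keep waiting'
# At least one MatchInfo object -> conditional is $true:
if ($result = 'Status: Idle' | Select-String -Pattern 'Idle') { 'send email' }  # -> 'send email'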
I use the following command in Powershell to convert files in the background but would like to log the results all in one file. Now the -RedirectStandardOutput replaces the file each run.
foreach ($l in gc ./files.txt) {Start-Process -FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" -Argumentlist "'$l' '$l.epub'" -Wait -WindowStyle Hidden -RedirectStandardOutput log.txt}
I tried with a redirect but then the log is empty.
If possible I would like to keep it a one-liner.
foreach ($l in gc ./files.txt) {Start-Process -FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" -Argumentlist "`"$l`" `"$l.epub`"" -Wait -WindowStyle Hidden *> log.txt}
If sequential, synchronous execution is acceptable, you can simplify your command to use a single output redirection (the assumption is that ebook-convert.exe is a console-subsystem application, which PowerShell therefore executes synchronously, in a blocking manner):
Get-Content ./files.txt | ForEach-Object {
& 'c:\Program Files (x86)\calibre2\ebook-convert.exe' $_ "$_.epub"
} *> log.txt
Placing * before > tells PowerShell to redirect all output streams, which in the case of external programs means both stdout and stderr.
If you want to control the character encoding, use Out-File - which > effectively is an alias for - with its -Encoding parameter; or, preferably, with text output - which external-program output always is in PowerShell - Set-Content. To also capture stderr output, append *>&1 to the command in the pipeline segment before the Out-File / Set-Content call.
Note that PowerShell never passes raw output from external programs through to files - they are first always decoded into .NET strings, based on the encoding stored in [Console]::OutputEncoding (the system's active legacy OEM code page by default), and then re-encoded on saving to a file, using the file-writing cmdlet's own defaults, unless overridden with -Encoding - see this answer for more information.
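For example, here is a sketch of the sequential command above with an explicit encoding (UTF-8 chosen purely for illustration); appending *>&1 to the ForEach-Object call merges stderr into the success stream so that Set-Content captures both:
Get-Content ./files.txt | ForEach-Object {
  & 'c:\Program Files (x86)\calibre2\ebook-convert.exe' $_ "$_.epub"
} *>&1 | Set-Content -Encoding utf8 log.txt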
If you want asynchronous, parallel execution (such as via Start-Process, which is asynchronous by default), your best bet is to:
write to separate (temporary) files:
Pass a different output file to -RedirectStandardOutput / -RedirectStandardError in each invocation.
Note that if you want to merge stdout and stderr output and capture it in the same file, you'll have to call your .exe file via a shell (possibly another PowerShell instance) and use its redirection features; for PowerShell, it would be *>log.txt; for cmd.exe (as shown below), it would be > log.txt 2>&1
wait for all launched processes to finish:
Pass -PassThru to Start-Process and collect the process-information objects returned.
Then use Wait-Process to wait for all processes to terminate; use the -Timeout parameter as needed.
and then merge them into a single log file.
Here's an implementation:
$procsAndLogFiles =
Get-Content ./files.txt | ForEach-Object -Begin { $i = 0 } {
# Create a distinct log file for each process,
# and return its name along with a process-information object representing
# each process as a custom object.
$logFile = 'log{0:000}.txt' -f ++$i
[pscustomobject] @{
LogFile = $logFile
Process = Start-Process -PassThru -WindowStyle Hidden `
-FilePath 'cmd.exe' `
-Argumentlist "/c `"`"c:\Program Files (x86)\calibre2\ebook-convert.exe`" `"$_`" `"$_.epub`" >`"$logFile`" 2>&1`""
}
}
# Wait for all processes to terminate.
# Add -Timeout and error handling as needed.
$procsAndLogFiles.Process | Wait-Process
# Merge all log files.
Get-Content -LiteralPath $procsAndLogFiles.LogFile > log.txt
# Clean up.
Remove-Item -LiteralPath $procsAndLogFiles.LogFile
If you want throttled parallel execution, so as to limit how many background processes can run at a time:
# Limit how many background processes may run in parallel at most.
$maxParallelProcesses = 10
# Initialize the log file.
# Use -Force to unconditionally replace an existing file.
New-Item log.txt
# Initialize the list in which those input files whose conversion
# failed due to timing out are recorded.
$allTimedOutFiles = [System.Collections.Generic.List[string]]::new()
# Process the input files in batches of $maxParallelProcesses
Get-Content -ReadCount $maxParallelProcesses ./files.txt |
ForEach-Object {
$i = 0
$launchInfos = foreach ($file in $_) {
# Create a distinct log file for each process,
# and return its name along with the input file name / path, and
# a process-information object representing each process, as a custom object.
$logFile = 'log{0:000}.txt' -f ++$i
[pscustomobject] @{
InputFile = $file
LogFile = $logFile
Process = Start-Process -PassThru -WindowStyle Hidden `
-FilePath 'cmd.exe' `
-ArgumentList "/c `"`"c:\Program Files (x86)\calibre2\ebook-convert.exe`" `"$file`" `"$_.epub`" >`"$file`" 2>&1`""
}
}
# Wait for the processes to terminate, with a timeout.
$launchInfos.Process | Wait-Process -Timeout 30 -ErrorAction SilentlyContinue -ErrorVariable errs
# If not all processes terminated within the timeout period,
# forcefully terminate those that didn't.
if ($errs) {
$timedOut = $launchInfos | Where-Object { -not $_.Process.HasExited }
Write-Warning "Conversion of the following input files timed out; the processes will killed:`n$($timedOut.InputFile)"
$timedOut.Process | Stop-Process -Force
$allTimedOutFiles.AddRange(@($timedOut.InputFile))
}
# Merge all temp. log files and append to the overall log file.
$tempLogFiles = Get-Item -ErrorAction Ignore -LiteralPath ($launchInfos.LogFile | Sort-Object)
$tempLogFiles | Get-Content >> log.txt
# Clean up.
$tempLogFiles | Remove-Item
}
# * log.txt now contains all combined logs
# * $allTimedOutFiles now contains all input file names / paths
# whose conversion was aborted due to timing out.
Note that the above throttling technique isn't optimal, because each batch of inputs is waited for together, at which point the next batch is started. A better approach is to launch a new process as soon as one of the available parallel "slots" frees up, as shown in the next section; however, note that PowerShell (Core) 7+ is required.
PowerShell (Core) 7+: Efficiently throttled parallel execution, using ForEach-Object -Parallel:
PowerShell (Core) 7+ introduced thread-based parallelism to the ForEach-Object cmdlet, via the -Parallel parameter, which has built-in throttling that defaults to a maximum of 5 threads, but can be controlled explicitly via the -ThrottleLimit parameter.
This enables efficient throttling, as a new thread is started as soon as an available slot opens up.
The following is a self-contained example that demonstrates the technique; it works on both Windows and Unix-like platforms:
Inputs are 9 integers, and the conversion process is simulated simply by sleeping a random number of seconds between 1 and 9, followed by echoing the input number.
A timeout of 6 seconds is applied to each child process, meaning that a random number of child processes will time out and be killed.
#requires -Version 7
# Use ForEach-Object -Parallel to launch child processes in parallel,
# limiting the number of parallel threads (from which the child processes are
# launched) via -ThrottleLimit.
# -AsJob returns a single job whose child jobs track the threads created.
$job =
1..9 | ForEach-Object -ThrottleLimit 3 -AsJob -Parallel {
# Determine a temporary, thread-specific log file name.
$logFile = 'log_{0:000}.txt' -f $_
# Pick a random sleep time that may or may not be smaller than the timeout period.
$sleepTime = Get-Random -Minimum 1 -Maximum 9
# Launch the external program asynchronously and save information about
# the newly launched child process.
if ($env:OS -eq 'Windows_NT') {
$ps = Start-Process -PassThru -WindowStyle Hidden cmd.exe "/c `"timeout $sleepTime >NUL & echo $_ >$logFile 2>&1`""
}
else { # macOS, Linux
$ps = Start-Process -PassThru sh "-c `"{ sleep $sleepTime; echo $_; } >$logFile 2>&1`""
}
# Wait for the child process to exit within a given timeout period.
$ps | Wait-Process -Timeout 6 -ErrorAction SilentlyContinue
# Check if a timeout has occurred (implied by the process not having exited yet)
$timedOut = -not $ps.HasExited
if ($timedOut) {
# Note: Only [Console]::WriteLine produces immediate output, directly to the display.
[Console]::WriteLine("Warning: Conversion timed out for: $_")
# Kill the timed-out process.
$ps | Stop-Process -Force
}
# Construct and output a custom object that indicates the input at hand,
# the associated log file, and whether a timeout occurred.
[pscustomobject] @{
InputFile = $_
LogFile = $logFile
TimedOut = $timedOut
}
}
# Wait for all child processes to exit or be killed
$processInfos = $job | Receive-Job -Wait -AutoRemoveJob
# Merge all temporary log files into an overall log file.
$tempLogFiles = Get-Item -ErrorAction Ignore -LiteralPath ($processInfos.LogFile | Sort-Object)
$tempLogFiles | Get-Content > log.txt
# Clean up the temporary log files.
$tempLogFiles | Remove-Item
# To illustrate the results, show the overall log file's content
# and which inputs caused timeouts.
[pscustomobject] @{
CombinedLogContent = Get-Content -Raw log.txt
InputsThatFailed = ($processInfos | Where-Object TimedOut).InputFile
} | Format-List
# Clean up the overall log file.
Remove-Item log.txt
You can use redirection and append to files if you don't use Start-Process, but a direct invocation:
foreach ($l in gc ./files.txt) {& 'C:\Program Files (x86)\calibre2\ebook-convert.exe' "$l" "$l.epub" *>> log.txt}
For the moment I'm using an adaption on mklement0's answer.
ebook-convert.exe often hangs so I need to close it down if the process takes longer than the designated time.
This needs to run asynchronously because of the number of files and the processor time taken (5 to 25% depending on the conversion).
The timeout needs to be per file, not on the jobs as a whole.
$procsAndLogFiles =
Get-Content ./files.txt | ForEach-Object -Begin { $i = 0 } {
# Create a distinct log file for each process,
# and return its name along with a process-information object representing
# each process as a custom object.
$logFile = 'd:\temp\log{0:000}.txt' -f ++$i
Write-Host "$(Get-Date) $_"
[pscustomobject] @{
LogFile = $logFile
Process = Start-Process `
-PassThru `
-FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" `
-Argumentlist "`"$_`" `"$_.epub`"" `
-WindowStyle Hidden `
-RedirectStandardOutput $logFile `
| Wait-Process -Timeout 30
}
}
# Wait for all processes to terminate.
# Add -Timeout and error handling as needed.
$procsAndLogFiles.Process
# Merge all log files.
Get-Content -LiteralPath $procsAndLogFiles.LogFile > log.txt
# Clean up.
Remove-Item -LiteralPath $procsAndLogFiles.LogFile
Since the problem in my other answer was not completely solved (not killing all the processes that take longer than the timeout limit) I rewrote it in Ruby.
It's not PowerShell, but if you land on this question and also know Ruby (or not) it could help you.
I believe it's the use of Threads that solves the killing issue.
require 'logger'
LOG = Logger.new("log.txt")
PROGRAM = 'c:\Program Files (x86)\calibre2\ebook-convert.exe'
LIST = 'E:\ebooks\english\_convert\mobi\files.txt'
TIMEOUT = 30
MAXTHREADS = 6
def run file, log: nil
output = ""
command = %Q{"#{PROGRAM}" "#{file}" "#{file}.epub" 2>&1}
IO.popen(command+" 2>&1") do |io|
begin
while (line=io.gets) do
output += line
log.info line.chomp if log
end
rescue => ex
log.error ex.message
system("taskkill /f /pid #{io.pid}") rescue log.error $#
end
end
if File.exist? "#{file}.epub"
puts "converted #{file}.epub"
File.delete(file)
else
puts "error #{file}"
end
output
end
threads = []
File.readlines(LIST).each do |file|
file.chomp! # remove line feed
# some checks
if !File.exist? file
puts "not found #{file}"
next
end
if File.exist? "#{file}.epub"
puts "skipping #{file}"
File.delete(file) if File.exist? file
next
end
# go on with the conversion
thread = Thread.new {run(file, log: LOG)}
threads << thread
next if threads.length < MAXTHREADS
threads.each do |t|
t.join(TIMEOUT)
unless t.alive?
t.kill
threads.delete(t)
end
end
end
I have two scripts I would like to combine, but the second script can't begin until a program (Photoshop) is closed. Script one ends by starting a Photoshop script with Invoke-Item. Once the Photoshop script is complete, Photoshop closes. The second script archives the raw files with a simple Move-Item. With PowerShell, how can I know when Photoshop is closed and begin my Move-Item?
I have spent some time researching this to see what documentation there is, but either I am asking my questions poorly or it is obscure enough that I can't find any leads to start from.
# Script One
ii "E:\resizerScript.jsx"
#Something to determine when PhotoShop is closed and begin the next bit of code.
# Script Two
Move-Item -path "E:\Staged\*" -Destination "E:\Archived"
I'm very new to coding and what I have is cobbled together from other articles. If anything is too unclear I would be happy to elaborate. Thanks in advance for any help or direction.
You can use Wait-Process,
Invoke-Item "E:\resizerScript.jsx"
Wait-Process photoshop
Move-Item -Path "E:\Staged\*" -Destination "E:\Archived"
but I recommend using Start-Process -Wait to start Photoshop.
$photoshopPath = "C:\...\Photoshop.exe"
Start-Process $photoshopPath "E:\resizerScript.jsx" -Wait
Move-Item -Path "E:\Staged\*" -Destination "E:\Archived"
If you want to set the timeout:
Start-Process $photoshopPath "E:\resizerScript.jsx" -PassThru |
Wait-Process -Timeout (15 * 60) -ErrorAction Stop
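If you also want to terminate Photoshop when that timeout expires (rather than just stop waiting), a sketch along these lines should work; it reuses $photoshopPath from above, and the timeout value is illustrative:
$ps = Start-Process $photoshopPath "E:\resizerScript.jsx" -PassThru
$ps | Wait-Process -Timeout (15 * 60) -ErrorAction SilentlyContinue
if (-not $ps.HasExited) {
    # Still running after the timeout period: kill it.
    $ps | Stop-Process -Force
}
Move-Item -Path "E:\Staged\*" -Destination "E:\Archived"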
First, you need to find Photoshop's process name. Open PowerShell and run
Get-Process | Select-Object -Property ProcessName
Then use the following (you can customize it according to your needs of course, I've tested using Outlook)
param(
[string]$procName = "Outlook",
[int]$timeout = 90, ## seconds
[int]$retryInterval = 1 ## seconds
)
$isProcActive = $true
$timer = [Diagnostics.Stopwatch]::StartNew()
# to check the process' name:
# Get-Process | Select-Object -Property ProcessName
while (($timer.Elapsed.TotalSeconds -lt $timeout) -and ($isProcActive)) {
$procId = (Get-Process | Where-Object -Property ProcessName -EQ $procName).Id
if ([string]::IsNullOrEmpty($procId))
{
Write-Host "$procName is finished"
$isProcActive = $false
}
# poll at the configured interval
Start-Sleep -Seconds $retryInterval
}
$timer.Stop()
if ($isProcActive)
{
Write-Host "$procName did not finish on time, aborting operation..."
# maybe you want to kill it?
# Stop-Process -Name $procName
exit
}
# do whatever
[UPDATE] If you need to put this inside another script, you need to omit the param block, since param must be the first statement in a script. So it would look like:
# start of script
$procName = "Outlook"
$timeout = 90 ## seconds
$retryInterval = 1 ## seconds
$isProcActive = $true
# etc etc
Hope this helps,
Jim
What I am looking for is to have PowerShell read a file's content out loud using the speech synthesis module.
The file name for this example will be read.txt.
Start of the Speech module:
Add-Type -AssemblyName System.speech
$Narrator1 = New-Object System.Speech.Synthesis.SpeechSynthesizer
$Narrator1.SelectVoice('Microsoft Zira Desktop')
$Narrator1.Rate = 2
$Location = "$env:userprofile\Desktop\read.txt"
$Contents = Get-Content $Location
Get-Content $Location -wait -Tail 2 | where {$Narrator1.Speak($Contents)}
This works once. I'd like to use Clear-Content to wipe read.txt after each initial read and have PowerShell wait until a new line is added to read.txt, then process it again and speak the content. I believe I can also make it run in the background with -WindowStyle Hidden.
Thank you in advance for any assistance.
Scott
I don't think a loop is the answer, I would use the FileSystemWatcher to detect when the file has changed. Try this:
$fsw = New-Object System.IO.FileSystemWatcher
$fsw.Path = "$env:userprofile\Desktop"
$fsw.Filter = 'read.txt'
# FileSystemWatcher does not raise events until this is enabled:
$fsw.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $fsw -EventName Changed -Action {
Add-Type -AssemblyName System.speech
$Narrator1 = New-Object System.Speech.Synthesis.SpeechSynthesizer
$Narrator1.SelectVoice('Microsoft Zira Desktop')
$Narrator1.Rate = 2
$file = $Event.SourceEventArgs.FullPath
$Contents = Get-Content $file
$Narrator1.Speak($Contents)
}
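If you later want to stop watching, you can remove the event subscription and dispose of the watcher; a minimal sketch (this removes all event subscriptions in the session - pass -SourceIdentifier to Unregister-Event to target a specific one):
# ... later, when done watching:
Get-EventSubscriber | Unregister-Event
$fsw.Dispose()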
Your only problem was that you accidentally used the previously assigned $Contents variable in the where (Where-Object) script block rather than $_, the automatic variable representing the current pipeline object:
Get-Content $Location -Wait -Tail 2 | Where-Object { $Narrator1.Speak($_) }
Get-Content $Location -Wait will poll the input file ($Location here) every second to check for new content and pass it through the pipeline (the -Tail argument only applies to the initial reading of the file; as new lines are added, they are all passed through).
The pipeline will stay alive indefinitely - until you delete the $Location file or abort processing.
Since the command is blocking, you obviously need another session / process to add content to file $Location, such as another PowerShell window or a text editor that has the file open and modifies its content.
You can keep appending to the file with >>, but that will keep growing it.
To discard the file's previous content, you must indeed use Clear-Content, as you say, which truncates the existing file without recreating it, and therefore keeps the pipeline alive; e.g.:
Clear-Content $Location
'another line to speak' > $Location
Caveat: Special characters such as ! and ? seem to cause a silent failure to speak. If anyone knows why, do tell us. The docs offer no immediate clues.
As for background operation:
With a background job, curiously, the Clear-Content / > combination appears not to work; if anybody knows why, please tell us.
However, using >> - which grows the file - does work.
The following snippet demonstrates the use of a background job to keep speaking input as it is being added to a specified file (with some delay), until a special end-of-input string is sent:
# Determine the input file (on the user's desktop)
$file = Join-Path ([environment]::GetFolderPath('Desktop')) 'read.txt'
# Initialize the input file.
$null > $file
# Define a special string that acts as the end-of-input marker.
$eofMarker = '[quit]'
# Start the background job (PSv3+ syntax)
$job = Start-Job {
Add-Type -AssemblyName System.speech
$Narrator1 = New-Object System.Speech.Synthesis.SpeechSynthesizer
$Narrator1.SelectVoice('Microsoft Zira Desktop')
$Narrator1.Rate = 2
while ($true) { # A dummy loop we can break out of on receiving the end-of-input marker
Get-Content $using:file -Wait | Where-Object {
if ($_ -eq $using:eofMarker) { break } # End-of-input marker received -> exit the pipeline.
$Narrator1.Speak($_)
}
}
# Remove the input file.
Remove-Item -ErrorAction Ignore -LiteralPath $using:file
}
# Speak 1, 2, ..., 10
1..10 | ForEach-Object {
Write-Verbose -Verbose $_
# !! Inexplicably, using Clear-Content followed by > to keep
# !! replacing the file content does *not* work with a background task.
# !! >> - which *appends* to the file - does work, however.
$_ >> $file
}
# Send the end-of-input marker to make the background job stop reading.
$eofMarker >> $file
# Wait for background processing to finish.
# Note: We'll get here long before the background job has finished speaking.
Write-Verbose -Verbose 'Waiting for processing to finish to cleanup...'
$null = Receive-Job $job -wait -AutoRemoveJob
I have a PowerShell script that spawns x number of other PowerShell scripts in a Fire-And-Forget way.
In order to keep track of the progress of all the scripts that I just start, I create a temp file, where I have all of them write log messages in json format to report progress.
In the parent script I then monitor that log file using Get-Content -Wait. Whenever I receive a line in the log file, I parse the json and update an array of objects that I then display using Format-Table. That way I can see how far the different scripts are in their process and if they fail at a specific step. That works well... almost.
I keep running into IOErrors because so many scripts are accessing the log file, and when that happens the script just aborts and I lose all information on what is going on.
I would be able to live with the spawned scripts running into an IOError because they just continue and then I just catch the next message. I can live with some messages getting lost as this is not an audit log, but just a progress log.
But when the script that tails the log crashes then I lose insight.
I have tried to wrap this in a Try/Catch but that doesn't help. I have tried setting -ErrorAction Stop inside the Try/Catch but that still doesn't catch the error.
My script that reads looks like this:
function WatchLogFile($statusFile)
{
Write-Host "Tailing statusfile: $($statusFile)"
Write-Host "Press CTRL-C to end."
Write-Host ""
Try {
Get-Content $statusFile -Force -Wait |
ForEach {
$logMsg = $_ | ConvertFrom-JSON
#Update status on step for specific service
$svc = $services | Where-Object {$_.Service -eq $logMsg.Service}
$svc.psobject.properties[$logMsg.step].value = $logMsg.status
Clear-Host
$services | Format-Table -Property Service,Old,New,CleanRepo,NuGet,Analyzers,CleanImports,Build,Invoke,Done,LastFailure
} -ErrorAction Stop
} Catch {
WatchLogFile $statusFile
}
}
And updates are written like this in the spawned scripts
Add-Content $statusFile $jsonLogMessage
Is there an easy way to add retries or how can I make sure my script survives file locks?
As @ChiliYago pointed out, I should use jobs. So that is what I have done now. I had to figure out how to get the output as it arrived from the many scripts.
So I added all my jobs to an array of jobs and monitored them like this. Beware that you can receive multiple lines if your script has had multiple outputs since you last invoked Receive-Job. Be sure to use Write-Output from the scripts you execute as jobs.
$jobs = @()
foreach ($script in $scripts)
{
$sb = [scriptblock]::create("$script $(&{$args} @jobArgs)")
$jobs += Start-Job -ScriptBlock $sb
}
# initialize so that the loop is entered at least once
$hasRunningJobs = $jobs.Count
while ($hasRunningJobs -gt 0)
{
$runningJobs = $jobs | Where-Object {$_.State -eq "Running"} | measure
$hasRunningJobs = $runningJobs.Count
foreach ($job in $jobs)
{
$outvar = Receive-Job -Job $job
if ($outvar)
{
$outvar -split "`n" | %{ UpdateStatusTable $_}
}
}
}
Write-Host "All scripts done."
I spent days trying to implement a parallel jobs-and-queues system, but I couldn't make it work. Here is the code without any of that implemented, and an example of the CSV it reads from.
I'm sure this post can help other users in their projects.
Each user has his own PC, so the CSV file looks like:
pc1,user1
pc2,user2
pc800,user800
CODE:
#Source File:
$inputCSV = '~\desktop\report.csv'
$csv = import-csv $inputCSV -Header PCName, User
echo $csv #debug
#Output File:
$report = "~\desktop\output.csv"
#---------------------------------------------------------------
#Define search:
$findSize = 40GB
Write-Host "Lonking for $findSize GB sized Outlook files"
#count issues:
$issues = 0
#---------------------------------------------------------------
foreach($item in $csv){
if (Test-Connection -Quiet -count 1 -computer $($item.PCname)){
$w7path = "\\$($item.PCname)\c$\users\$($item.User)\appdata\Local\microsoft\outlook"
$xpPath = "\\$($item.PCname)\c$\Documents and Settings\$($item.User)\Local Settings\Application Data\Microsoft\Outlook"
if(Test-Path $W7path){
if(Get-ChildItem $w7path -Recurse -force -Include *.ost -ErrorAction "SilentlyContinue" | Where-Object {$_.Length -gt $findSize}){
$newLine = "{0},{1},{2}" -f $($item.PCname),$($item.User),$w7path
$newLine | add-content $report
$issues ++
Write-Host "Issue detected" #debug
}
}
elseif(Test-Path $xpPath){
if(Get-ChildItem $xpPath -Recurse -force -Include *.ost -ErrorAction "SilentlyContinue" | Where-Object {$_.Length -gt $findSize}){
$newLine = "{0},{1},{2}" -f $($item.PCname),$($item.User),$xpPath
$newLine | add-content $report
$issues ++
Write-Host "Issue detected" #debug
}
}
else{
write-host "Error! - bad path"
}
}
else{
write-host "Error! - no ping"
}
}
Write-Host "All done! detected $issues issues"
Parallel data processing in PowerShell is not quite simple, especially with
queueing. Try to use existing tools that already have this implemented.
You may take a look at the module
SplitPipeline. The cmdlet
Split-Pipeline is designed for parallel input data processing and supports
queueing of input (see the parameter Load). For example, for 4 parallel
pipelines with 10 input items each at a time the code will look like this:
$csv | Split-Pipeline -Count 4 -Load 10, 10 {process{
<operate on input item $_>
}} | Out-File $outputReport
All you have to do is to implement the code <operate on input item $_>.
Parallel processing and queueing is done by this command.
UPDATE for the updated question code. Here is the prototype code with some
remarks. They are important. Doing work in parallel is not the same as doing
it directly; there are some rules to follow.
$csv | Split-Pipeline -Count 4 -Load 10, 10 -Variable findSize {process{
# Tips
# - Operate on input object $_, i.e $_.PCname and $_.User
# - Use imported variable $findSize
# - Do not use Write-Host, use (for now) Write-Warning
# - Do not count issues (for now). This is possible but make it working
# without this at first.
# - Do not write data to a file, from several parallel pipelines this
# is not so trivial, just output data, they will be piped further to
# the log file
...
}} | Set-Content $report
# output from all jobs is joined and written to the report file
UPDATE: How to write progress information
SplitPipeline handled an 800-target CSV pretty well, amazing. Is there any way
to let the user know that the script is alive...? Scanning a big CSV can take
about 20 minutes. Something like "in progress 25%", "50%", "75%"...
There are several options. The simplest is just to invoke Split-Pipeline with
the switch -Verbose. So you will get verbose messages about the progress and
see that the script is alive.
Another simple option is to write and watch verbose messages from the jobs,
e.g. Write-Verbose ... -Verbose which will write messages even if
Split-Pipeline is invoked without Verbose.
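A minimal sketch of that option, building on the prototype above (the message text is illustrative):
$csv | Split-Pipeline -Count 4 -Load 10, 10 -Variable findSize {process{
    # liveness/progress message; -Verbose on Write-Verbose makes it show
    # even when Split-Pipeline itself is not invoked with -Verbose
    Write-Verbose "Checking $($_.PCname)..." -Verbose
    # <operate on input item $_>
}} | Set-Content $report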
And another option is to use proper progress messages with Write-Progress.
See the scripts:
Test-ProgressJobs.ps1
Test-ProgressTotal.ps1
Test-ProgressTotal.ps1 also shows how to use a collector updated from jobs
concurrently. You can use the similar technique for counting issues (the
original question code does this). When all is done show the total number of
issues to a user.