I use the following command in PowerShell to convert files in the background, but I would like to log the results all in one file. Currently, -RedirectStandardOutput replaces the file on each run.
foreach ($l in gc ./files.txt) {Start-Process -FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" -Argumentlist "'$l' '$l.epub'" -Wait -WindowStyle Hidden -RedirectStandardOutput log.txt}
I tried with a redirect but then the log is empty.
If possible I would like to keep it a one-liner.
foreach ($l in gc ./files.txt) {Start-Process -FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" -Argumentlist "`"$l`" `"$l.epub`"" -Wait -WindowStyle Hidden *> log.txt}
If sequential, synchronous execution is acceptable, you can simplify your command to use a single output redirection (the assumption is that ebook-convert.exe is a console-subsystem application, which PowerShell therefore executes synchronously, in a blocking manner):
Get-Content ./files.txt | ForEach-Object {
& 'c:\Program Files (x86)\calibre2\ebook-convert.exe' $_ "$_.epub"
} *> log.txt
Placing * before > tells PowerShell to redirect all output streams, which in the case of external programs means both stdout and stderr.
If you want to control the character encoding, use Out-File - which > is effectively an alias for - with its -Encoding parameter; or, preferably for text output - which external-program output in PowerShell always is - Set-Content. To also capture stderr output, append *>&1 to the command in the pipeline segment before the Out-File / Set-Content call.
Note that PowerShell never passes raw output from external programs through to files - it is always first decoded into .NET strings, based on the encoding stored in [Console]::OutputEncoding (the system's active legacy OEM code page by default), and then re-encoded on saving to a file, using the file-writing cmdlet's own defaults, unless overridden with -Encoding - see this answer for more information.
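For example, a minimal sketch that merges all streams and saves the log as UTF-8 (the choice of UTF-8 here is an assumption; substitute whatever encoding you need):
Get-Content ./files.txt | ForEach-Object {
  & 'c:\Program Files (x86)\calibre2\ebook-convert.exe' $_ "$_.epub"
} *>&1 | Set-Content -Encoding utf8 log.txt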
If you want asynchronous, parallel execution (such as via Start-Process, which is asynchronous by default), your best bet is to:
write to separate (temporary) files:
Pass a different output file to -RedirectStandardOutput / -RedirectStandardError in each invocation.
Note that if you want to merge stdout and stderr output and capture it in the same file, you'll have to call your .exe file via a shell (possibly another PowerShell instance) and use its redirection features; for PowerShell, it would be *>log.txt; for cmd.exe (as shown below), it would be > log.txt 2>&1
wait for all launched processes to finish:
Pass -PassThru to Start-Process and collect the process-information objects returned.
Then use Wait-Process to wait for all processes to terminate; use the -Timeout parameter as needed.
and then merge them into a single log file.
Here's an implementation:
$procsAndLogFiles =
Get-Content ./files.txt | ForEach-Object -Begin { $i = 0 } {
# Create a distinct log file for each process,
# and return its name along with a process-information object representing
# each process as a custom object.
$logFile = 'log{0:000}.txt' -f ++$i
[pscustomobject] @{
LogFile = $logFile
Process = Start-Process -PassThru -WindowStyle Hidden `
-FilePath 'cmd.exe' `
-Argumentlist "/c `"`"c:\Program Files (x86)\calibre2\ebook-convert.exe`" `"$_`" `"$_.epub`" >`"$logFile`" 2>&1`""
}
}
# Wait for all processes to terminate.
# Add -Timeout and error handling as needed.
$procsAndLogFiles.Process | Wait-Process
# Merge all log files.
Get-Content -LiteralPath $procsAndLogFiles.LogFile > log.txt
# Clean up.
Remove-Item -LiteralPath $procsAndLogFiles.LogFile
If you want throttled parallel execution, so as to limit how many background processes can run at a time:
# Limit how many background processes may run in parallel at most.
$maxParallelProcesses = 10
# Initialize the log file.
# Use -Force to unconditionally replace an existing file.
New-Item log.txt
# Initialize the list in which those input files whose conversion
# failed due to timing out are recorded.
$allTimedOutFiles = [System.Collections.Generic.List[string]]::new()
# Process the input files in batches of $maxParallelProcesses
Get-Content -ReadCount $maxParallelProcesses ./files.txt |
ForEach-Object {
$i = 0
$launchInfos = foreach ($file in $_) {
# Create a distinct log file for each process,
# and return its name along with the input file name / path, and
# a process-information object representing each process, as a custom object.
$logFile = 'log{0:000}.txt' -f ++$i
[pscustomobject] @{
InputFile = $file
LogFile = $logFile
Process = Start-Process -PassThru -WindowStyle Hidden `
-FilePath 'cmd.exe' `
-ArgumentList "/c `"`"c:\Program Files (x86)\calibre2\ebook-convert.exe`" `"$file`" `"$_.epub`" >`"$file`" 2>&1`""
}
}
# Wait for the processes to terminate, with a timeout.
$launchInfos.Process | Wait-Process -Timeout 30 -ErrorAction SilentlyContinue -ErrorVariable errs
# If not all processes terminated within the timeout period,
# forcefully terminate those that didn't.
if ($errs) {
$timedOut = $launchInfos | Where-Object { -not $_.Process.HasExited }
Write-Warning "Conversion of the following input files timed out; the processes will killed:`n$($timedOut.InputFile)"
$timedOut.Process | Stop-Process -Force
$allTimedOutFiles.AddRange(@($timedOut.InputFile))
}
# Merge all temp. log files and append to the overall log file.
$tempLogFiles = Get-Item -ErrorAction Ignore -LiteralPath ($launchInfos.LogFile | Sort-Object)
$tempLogFiles | Get-Content >> log.txt
# Clean up.
$tempLogFiles | Remove-Item
}
# * log.txt now contains all combined logs
# * $allTimedOutFiles now contains all input file names / paths
# whose conversion was aborted due to timing out.
Note that the above throttling technique isn't optimal, because each batch of inputs is waited for together, and only then is the next batch started. A better approach is to launch a new process as soon as one of the available parallel "slots" frees up, as shown in the next section; however, note that PowerShell (Core) 7+ is required.
PowerShell (Core) 7+: Efficiently throttled parallel execution, using ForEach-Object -Parallel:
PowerShell (Core) 7+ introduced thread-based parallelism to the ForEach-Object cmdlet, via the -Parallel parameter, which has built-in throttling that defaults to a maximum of 5 threads, but can be controlled explicitly via the -ThrottleLimit parameter.
This enables efficient throttling, as a new thread is started as soon as an available slot opens up.
The following is a self-contained example that demonstrates the technique; it works on both Windows and Unix-like platforms:
Inputs are 9 integers, and the conversion process is simulated simply by sleeping a random number of seconds between 1 and 8 (Get-Random's -Maximum bound is exclusive), followed by echoing the input number.
A timeout of 6 seconds is applied to each child process, meaning that a random number of child processes will time out and be killed.
#requires -Version 7
# Use ForEach-Object -Parallel to launch child processes in parallel,
# limiting the number of parallel threads (from which the child processes are
# launched) via -ThrottleLimit.
# -AsJob returns a single job whose child jobs track the threads created.
$job =
1..9 | ForEach-Object -ThrottleLimit 3 -AsJob -Parallel {
# Determine a temporary, thread-specific log file name.
$logFile = 'log_{0:000}.txt' -f $_
# Pick a random sleep time that may or may not be smaller than the timeout period.
$sleepTime = Get-Random -Minimum 1 -Maximum 9
# Launch the external program asynchronously and save information about
# the newly launched child process.
if ($env:OS -eq 'Windows_NT') {
$ps = Start-Process -PassThru -WindowStyle Hidden cmd.exe "/c `"timeout $sleepTime >NUL & echo $_ >$logFile 2>&1`""
}
else { # macOS, Linux
$ps = Start-Process -PassThru sh "-c `"{ sleep $sleepTime; echo $_; } >$logFile 2>&1`""
}
# Wait for the child process to exit within a given timeout period.
$ps | Wait-Process -Timeout 6 -ErrorAction SilentlyContinue
# Check if a timeout has occurred (implied by the process not having exited yet).
$timedOut = -not $ps.HasExited
if ($timedOut) {
# Note: Only [Console]::WriteLine produces immediate output, directly to the display.
[Console]::WriteLine("Warning: Conversion timed out for: $_")
# Kill the timed-out process.
$ps | Stop-Process -Force
}
# Construct and output a custom object that indicates the input at hand,
# the associated log file, and whether a timeout occurred.
[pscustomobject] @{
InputFile = $_
LogFile = $logFile
TimedOut = $timedOut
}
}
# Wait for all child processes to exit or be killed
$processInfos = $job | Receive-Job -Wait -AutoRemoveJob
# Merge all temporary log files into an overall log file.
$tempLogFiles = Get-Item -ErrorAction Ignore -LiteralPath ($processInfos.LogFile | Sort-Object)
$tempLogFiles | Get-Content > log.txt
# Clean up the temporary log files.
$tempLogFiles | Remove-Item
# To illustrate the results, show the overall log file's content
# and which inputs caused timeouts.
[pscustomobject] @{
CombinedLogContent = Get-Content -Raw log.txt
InputsThatFailed = ($processInfos | Where-Object TimedOut).InputFile
} | Format-List
# Clean up the overall log file.
Remove-Item log.txt
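To adapt this demo to the actual conversion task, here's a rough sketch (assumptions: per-input log names derived from the input path, the 30-second timeout used earlier, and Windows, with cmd.exe providing the stdout/stderr merging):
#requires -Version 7
$job =
  Get-Content ./files.txt | ForEach-Object -ThrottleLimit 3 -AsJob -Parallel {
    # Per-input log file (assumed naming scheme: input path + '.log').
    $logFile = "$_.log"
    # Launch via cmd.exe so stdout and stderr land in the same file.
    $ps = Start-Process -PassThru -WindowStyle Hidden cmd.exe "/c `"`"c:\Program Files (x86)\calibre2\ebook-convert.exe`" `"$_`" `"$_.epub`" >`"$logFile`" 2>&1`""
    $ps | Wait-Process -Timeout 30 -ErrorAction SilentlyContinue
    $timedOut = -not $ps.HasExited
    if ($timedOut) { $ps | Stop-Process -Force } # kill hung conversions
    [pscustomobject] @{ InputFile = $_; LogFile = $logFile; TimedOut = $timedOut }
  }
# Collect the results, then merge and clean up the per-input log files
# the same way the demo above does.
$processInfos = $job | Receive-Job -Wait -AutoRemoveJob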
You can use redirection and append to files if you don't use Start-Process, but a direct invocation:
foreach ($l in gc ./files.txt) {& 'C:\Program Files (x86)\calibre2\ebook-convert.exe' "$l" "$l.epub" *>> log.txt}
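If you also want the combined log to show which output belongs to which input file, a purely illustrative variant adds a separator line per file (the separator format is an assumption; adjust or omit as desired):
foreach ($l in gc ./files.txt) {
  "=== $l ===" >> log.txt  # separator line per input file
  & 'C:\Program Files (x86)\calibre2\ebook-convert.exe' "$l" "$l.epub" *>> log.txt
}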
For the moment I'm using an adaptation of mklement0's answer.
ebook-convert.exe often hangs, so I need to close it down if the process takes longer than the designated time.
This needs to run asynchronously because of the number of files and the processor time taken (5 to 25% depending on the conversion).
The timeout needs to be per file, not on the whole of the jobs.
$procsAndLogFiles =
Get-Content ./files.txt | ForEach-Object -Begin { $i = 0 } {
# Create a distinct log file for each process,
# and return its name along with a process-information object representing
# each process as a custom object.
$logFile = 'd:\temp\log{0:000}.txt' -f ++$i
Write-Host "$(Get-Date) $_"
[pscustomobject] @{
LogFile = $logFile
Process = Start-Process `
-PassThru `
-FilePath "c:\Program Files (x86)\calibre2\ebook-convert.exe" `
-Argumentlist "`"$_`" `"$_.epub`"" `
-WindowStyle Hidden `
-RedirectStandardOutput $logFile `
| Wait-Process -Timeout 30
}
}
# NOTE: Each conversion has already been waited for by the Wait-Process call above,
# and since Wait-Process produces no output, the Process property is empty here.
# Merge all log files.
Get-Content -LiteralPath $procsAndLogFiles.LogFile > log.txt
# Clean up.
Remove-Item -LiteralPath $procsAndLogFiles.LogFile
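Note that piping Start-Process -PassThru to Wait-Process, as above, has two side effects: Wait-Process produces no output, so the Process property ends up empty, and each conversion is waited for before the next one starts, making the loop sequential. A minimal sketch of a replacement for the Process = ... | Wait-Process part above that keeps the process object and kills it on timeout (same 30-second limit; still sequential):
$ps = Start-Process -PassThru -WindowStyle Hidden `
  -FilePath 'c:\Program Files (x86)\calibre2\ebook-convert.exe' `
  -ArgumentList "`"$_`" `"$_.epub`"" `
  -RedirectStandardOutput $logFile
$ps | Wait-Process -Timeout 30 -ErrorAction SilentlyContinue
if (-not $ps.HasExited) { $ps | Stop-Process -Force } # kill hung conversions
[pscustomobject] @{ LogFile = $logFile; Process = $ps }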
Since the problem in my other answer was not completely solved (not killing all the processes that take longer than the timeout limit) I rewrote it in Ruby.
It's not PowerShell, but if you land on this question and also know Ruby (or not), it could help you.
I believe it's the use of Threads that solves the killing issue.
require 'logger'
LOG = Logger.new("log.txt")
PROGRAM = 'c:\Program Files (x86)\calibre2\ebook-convert.exe'
LIST = 'E:\ebooks\english\_convert\mobi\files.txt'
TIMEOUT = 30
MAXTHREADS = 6
def run file, log: nil
output = ""
command = %Q{"#{PROGRAM}" "#{file}" "#{file}.epub"}
IO.popen(command + " 2>&1") do |io|
begin
while (line=io.gets) do
output += line
log.info line.chomp if log
end
rescue => ex
log.error ex.message
system("taskkill /f /pid #{io.pid}") rescue log.error $#
end
end
if File.exist? "#{file}.epub"
puts "converted #{file}.epub"
File.delete(file)
else
puts "error #{file}"
end
output
end
threads = []
File.readlines(LIST).each do |file|
file.chomp! # remove line feed
# some checks
if !File.exist? file
puts "not found #{file}"
next
end
if File.exist? "#{file}.epub"
puts "skipping #{file}"
File.delete(file) if File.exist? file
next
end
# go on with the conversion
thread = Thread.new {run(file, log: LOG)}
threads << thread
next if threads.length < MAXTHREADS
# Join each thread with a timeout; kill any thread still running afterwards.
# (Iterate over a copy: deleting from an array while iterating it skips elements.)
threads.dup.each do |t|
t.join(TIMEOUT)
t.kill if t.alive?
threads.delete(t)
end
end
I'm using FFmpeg with PowerShell.
I have a loop that goes through a folder of mpg files and grabs the names to a variable $inputName.
FFmpeg then converts each one to an mp4.
Works
Batch Processing
$files = Get-ChildItem "C:\Path\" -Filter *.mpg;
foreach ($f in $files) {
$inputName = $f.Name; #name + extension
$outputName = (Get-Item $inputName).Basename; #name only
ffmpeg -y -i "C:\Users\Matt\Videos\$inputName" -c:v libx264 -crf 25 "C:\Users\Matt\Videos\$outputName.mp4"
}
Not Working
Batch Processing with Process Priority
$files = Get-ChildItem "C:\Path\" -Filter *.mpg;
foreach ($f in $files) {
$inputName = $f.Name; #name + extension
$outputName = (Get-Item $inputName).Basename; #name only
($Process = Start-Process ffmpeg -NoNewWindow -ArgumentList '-y -i "C:\Users\Matt\Videos\$inputName" -c:v libx264 -crf 25 "C:\Users\Matt\Videos\$outputName.mp4"' -PassThru).PriorityClass = [System.Diagnostics.ProcessPriorityClass]::AboveNormal;
Wait-Process -Id $Process.id
}
If I set the Process Priority using Start-Process PriorityClass, the $inputName variable is no longer recognized.
Error:
C:\Users\Matt\Videos\$inputName: No such file or directory
Let's go over a few basic things.
In PowerShell we love piping |. It allows us to pass information from one command to another.
A good example of this is the foreach you have.
Instead of foreach ($F in $Files) you can pipe | into a ForEach-Object:
Get-ChildItem "C:\Path\" -Filter *.mpg | Foreach-Object{
$_
}
When piping | to a command, PowerShell automatically creates the variable $_, which is the object passed down the pipe |.
The next thing is that there are 2 types of quotes, " and '.
If you use ', then everything is taken literally. Example:
$FirstName = "TestName"
'Hey There $FirstName'
Will return
Hey There $FirstName
While " allows you to use Variables in it. Example
$FirstName = "TestName"
'Hey There $FirstName'
Will return
Hey There TestName
Now one last thing before we fix this. In PowerShell we have an escape char, ` , aka a backtick. It's located beside the number 1 on the keyboard, together with the tilde. You use it to allow the use of characters that would otherwise break out of the quotes. Example:
"`"Hey There`""
Would return
"Hey There"
OK, so now that we've covered the basics, let's fix up the script:
Get-ChildItem "C:\Users\Matt\Videos\" -Filter *.mpg -File | Foreach-Object{
($Process = Start-Process ffmpeg -NoNewWindow -ArgumentList "-y -i `"$($_.FullName)`" -c:v libx264 -crf 25 `"C:\Users\Matt\Videos\$($_.BaseName).mp4`"" -PassThru).PriorityClass = [System.Diagnostics.ProcessPriorityClass]::AboveNormal;
Try{
Wait-Process -Id $Process.id
}catch{
}
}
In the case above I:
Added -File to the Get-ChildItem to designate that you only want files returned, not folders
Piped | into a ForEach-Object
Changed the outside quotes in the -ArgumentList to double quotes " instead of literal quotes '
Removed $InputName and $OutputName in favor of the ForEach-Object variable $_ (using $_.FullName for the input and $_.BaseName for the output name)
I need to make a simple script for a DB logical dump. The goal is to use the script for two purposes.
If I run it with parameters (DB names), it shall create a logical dump of those DBs.
If I run it without parameters, it runs the command for a hardcoded list of DBs.
I want to check the error code for each command invocation (pg_dump) inside the foreach loop, log it, and continue.
What's the best way to do it?
So far I discovered that I can use try and catch.
Side-note: In my code I tried try..catch only once for testing purposes.
$path = '"C:\Program Files\PostgreSQL\version\bin\pg_dump.exe"'
$backup_path = 'D:\Backups\test\'
$limit = (Get-Date).AddDays(-2)
$logdate = (Get-Date).ToString("yyyy-MM-dd")
$pg_suffix = "pg_dump"
#$LogFile = $backup_path\$pg_suffix_$(Get-Date -f yyyy-MM-dd) + '.log'
$pg_dump_error = "pg_dump has failed"
$p = $args
[array]$DB_Array = @('postgres', 'db02', 'db03')
if ($p -ne $null) {
try {
foreach ($DB in $p) {
$backup_path_temp = $backup_path + $DB + '_' + $(Get-Date -f yyyy-MM-dd) + '.backup'
cmd /c "$path -w -h localhost -U postgres -Z3 -Fd -j 12 -f $backup_path_temp $DB"
}
} catch {
#"Error! $pg_dump_error" | Tee-Object -FilePath $LogFile -Append | Write-Error }
Write "Error: $file.name: $_" >>D:\\Backups\logfile.txt
}
continue;
} else {
foreach ($DB in $DB_Array) {
$backup_path_temp = $backup_path + $DB + '_' + $(Get-Date -f yyyy-MM-dd) + '.backup'
cmd /c "$path -w -h localhost -U postgres -Z3 -Fd -j 12 -f $backup_path_temp $DB"
}
}
# Delete files older than the $limit.
Get-ChildItem -Path $backup_path -Force |
Where-Object { !$_.PSIsContainer -and $_.CreationTime -lt $limit } |
Remove-Item -Force -Recurse
Use the $LASTEXITCODE automatic variable to get the exit code of an executable in PowerShell.
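A minimal sketch of how that could look in the loop from the question (the log path and message format are illustrative assumptions):
foreach ($DB in $DB_Array) {
  $backup_path_temp = $backup_path + $DB + '_' + (Get-Date -f yyyy-MM-dd) + '.backup'
  cmd /c "$path -w -h localhost -U postgres -Z3 -Fd -j 12 -f $backup_path_temp $DB"
  if ($LASTEXITCODE -ne 0) {
    # pg_dump (run via cmd /c) reported failure; log it and continue with the next DB.
    "$(Get-Date -f s) pg_dump failed for $DB (exit code $LASTEXITCODE)" >> D:\Backups\logfile.txt
  }
}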
I have a tool that logs some data onto a file. I'd like to tail the file and send the last line of data via mosquitto_pub.
I've used the PowerShell Get-Content command without success.
Here's my command:
Get-Content -Path "C:\test.txt" -Wait | .\mosquitto_pub.exe -t "Events"
But nothing is published by mosquitto_pub.
If I use Get-Content -Path "C:\test.txt" -Wait
I see the tail of the file in stdout.
What's wrong with my solution?
Thanks!
Read this Q and A.
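For context: mosquitto_pub does not read its message from stdin by default, so the piped Get-Content output is ignored. A minimal sketch, assuming your mosquitto_pub build supports the -l (line mode) option, which publishes each line read from stdin as a separate message:
Get-Content -Path "C:\test.txt" -Wait | .\mosquitto_pub.exe -t "Events" -l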
An alternate approach
$minsToRunFor = 10
$secondsToRunFor = $minsToRunFor * 60
foreach ($second in 1..$secondsToRunFor) {
$lastline = Get-Content -Path "C:\test.txt" | Select-Object -last 1
# added condition as per VonPryz's good point
# (otherwise will add lastline regardless of whether it's new or not)
if ($lastline -ne $oldlastline){
.\mosquitto_pub.exe -t "Events" -m "$lastline"
}
$oldlastline = $lastline
Start-Sleep -Seconds 1 # poll once per second
}
We have the following unix command:
/usr/bin/tail -n 1 %{path} | grep --silent -F "%{message}" && rm -f %{path}%
This:
/usr/bin/tail -n 1 %{path} gets the last line in the file that the path variable refers to
| grep --silent -F "%{message}" pipes the output to another command, grep, which checks whether the output of the previous command contains the value of message
&& rm -f %{path}% if a match is found, then delete the file referred to by path
The above line is in a configuration file which allows calls to be made to the underlying operating system.
I want to replicate the functionality on Windows.
I tried this:
command => 'powershell -Command "& {Get-Item $args[0] | ? { (Get-Content $_ -Tail 1).Contains($args[1]) }| Remove-Item -Force}" "'%path%'" "'%message%'"'
This error is thrown:
Error: Expected one of #, {, } at line 15, column 131 (byte 498)
Line 15 is the line in the configuration file which contains the above.
Thanks
PowerShell solution:
$path = 'C:\path\to\your.txt'
$message = 'message'
Get-Item $path | ? { (Get-Content $_ -Tail 1).Contains($message) } | Remove-Item -Force
If you want to run it from a command line, call it like this:
powershell -Command "& {Get-Item $args[0] | ? { (Get-Content $_ -Tail 1).Contains($args[1]) } | Remove-Item -Force}" "'C:\path\to\your.txt'" "'message'"
You can use tailhead.bat (a pure batch script utility) to show the last/first lines of a file. Instead of grep you can use findstr or find:
tailhead.bat -file=%pathToFile% -begin=-3 | find "%message%"