I have a script which calls out to an external program (SoX) for each file in a directory. I'm calling Write-Progress before each call to SoX, but the progress bar is pushed off the top of the console buffer by the output of SoX (regardless of the size of the console). Is there anything I can do to avoid this?
Here's the script:
$audioFiles = ls -Exclude *.ps1 | ? { !$_.PSIsContainer }
foreach ($audioFile in $audioFiles)
{
$i++
Write-Progress -Activity "Transforming Audio" -Status $audioFile.Name -PercentComplete (($i / @($audioFiles).length) * 100)
& 'C:\Program Files (x86)\sox-14-4-0\sox.exe' "$audioFile" ('Fast/' + $audioFile.Name) -S -G tempo -s 1.3
}
Write-Progress -Activity "Transforming Audio" -PercentComplete 100 -Completed
[void] [Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
[windows.forms.messagebox]::show("All done!")
Maybe you can set the console cursor position back to the same place before each SoX call?
Please try:
$audioFiles = ls -Exclude *.ps1 | ? { !$_.PSIsContainer }
foreach ($audioFile in $audioFiles){
$i++
Write-Progress -Activity "Transforming Audio" -Status $audioFile.Name -PercentComplete (($i / @($audioFiles).length) * 100)
# get windows height
$y=[int]($host.ui.rawui.WindowSize.Height -5)
# will set cursor position to bottom of the screen
$Host.UI.RawUI.CursorPosition = New-Object System.Management.Automation.Host.Coordinates 2,$y
#clear current line
$sbOut = new-object System.Text.Stringbuilder
(0.. $Host.UI.RawUI.WindowSize.Width)|%{$sbOut.append(' ')} |out-Null
write-Host $sbOut.toString() -NoNewline
& 'C:\Program Files (x86)\sox-14-4-0\sox.exe' "$audioFile" ('Fast/' + $audioFile.Name) -S -G tempo -s 1.3
}
Write-Progress -Activity "Transforming Audio" -PercentComplete 100 -Completed
[void] [Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
[windows.forms.messagebox]::show("All done!")
You could also redirect the SoX output to a file, or, if you don't care about the SoX output, just redirect it to Out-Null:
& 'C:\Program Files (x86)\sox-14-4-0\sox.exe' "$audioFile" ('Fast/' + $audioFile.Name) -S -G tempo -s 1.3 | out-null
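To capture the output in a file instead, you can merge the error stream (where SoX typically writes its -S progress display) into the success stream and append everything to a log. This is only a sketch; sox.log is a placeholder path:
# keep the console clean by appending all sox console output to a log file
& 'C:\Program Files (x86)\sox-14-4-0\sox.exe' "$audioFile" ('Fast/' + $audioFile.Name) -S -G tempo -s 1.3 2>&1 | Out-File -FilePath 'sox.log' -Append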
The PowerShell code below can help get you started writing your own aliases for ninja-like activity.
Replace %userprofile% with your username.
Replace the [hints] with the actual paths.
Type $profile in your PowerShell CLI to locate your profile, and append the code below to get started.
Will this activity drag the system's performance?
function poweroff_ {
shutdown -s -f
}
function hibernate_ {
shutdown -h
}
function restart_ {
shutdown -r -f
}
function eject_usb($driveletter){
# safely eject the specified drive, e.g. d:\
$driveEject = New-Object -comObject Shell.Application
$driveEject.Namespace(17).ParseName($driveletter).InvokeVerb("Eject")
}
function cleanup_ {
#remove clutter from temp directory
rm -r -fo "C:\Users\%userprofile%\AppData\Local\Temp"
#clear Recycle-bin
clear-RecycleBin -confirm:$false
}
function pomodoro_timer{
#Reference: https://en.wikipedia.org/wiki/Pomodoro_Technique
Write-Host "'f' for Foucs(40 mins), 'sb' fo Short Break(10 mins), 'lb' for Long Break(30 mins)"
$logic = Read-Host "Ready? [f/sb/lb]"
if($logic -eq "f"){
$minutes = 40
} elseif ($logic -eq "sb") {
$minutes = 10
} elseif ($logic -eq "lb") {
$minutes = 30
}
#To-Do: Include background music [vlc] for each mode, and upon completing - play alert sound [custom sound, from local path]
$seconds = $minutes * 60
$delay = 1 #seconds between ticks
for ($i = $seconds; $i -gt 0; $i = $i - $delay) {
$percentComplete = 100 - (($i / $seconds) * 100)
Write-Progress -SecondsRemaining $i `
-Activity "Pomodoro Focus sessions" `
-Status "Time remaining:" `
-PercentComplete $percentComplete
if ($i -eq 16){Write-Host "Wrapping up, you will be available in $i seconds" -ForegroundColor Green}
Start-Sleep -Seconds $delay
}#Timer ended
}
# Native command renaming
set-alias -Name unzip -Value expand-archive
# Path specific
set-alias -Name np -Value notepad.exe
set-alias -name notes -Value C:\Users\%userprofile%\AppData\Local\[notetaking app.exe]
set-alias -name brave -value C:\Users\%userprofile%\AppData\Local\BraveSoftware\Brave-Browser\Application\brave.exe
Set-Alias -name paswdmng -value C:\Users\%userprofile%\[password manager app.exe]
Set-Alias -Name vs -value "C:\Users\%userprofile%\AppData\Local\Programs\Microsoft VS Code\Code.exe"
Set-Alias -Name outlook -Value "C:\Users\%userprofile%\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Brave Apps\Outlook.lnk"
Set-Alias -Name teams -Value "C:\Users\%userprofile%\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Microsoft Teams.lnk"
Set-Alias -Name vlc -Value "C:\Program Files\VideoLAN\VLC\vlc.exe"
# Invoking custom functions
New-Alias poweroff poweroff_
New-Alias freeze hibernate_
New-Alias restart restart_
New-Alias eject eject_usb
New-Alias cleanup cleanup_
New-alias focusmode pomodoro_timer
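After appending these to your profile, reload it (or open a new console) so the functions and aliases become available:
# dot-source the profile to pick up the new functions and aliases in the current session
. $PROFILE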
I have tried to limit the number of Start-Process instances running from PowerShell, but I can't seem to get it to work.
I tried to follow this process: https://exchange12rocks.org/2015/05/24/how-to-limit-a-number-of-powershell-jobs-running-simultaneously/ and Run N parallel jobs in powershell
But these are for Jobs, not Processes, and I would like to remove the -Wait from Start-Process.
My concern with the script is that if there are 1000 audio files in the folder, then FFmpeg would crash the system.
# get the folder for conversion
function mbAudioConvert {
[Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
[System.Windows.Forms.Application]::EnableVisualStyles()
$fileBrowser = New-Object System.Windows.Forms.FolderBrowserDialog
$fileBrowser.SelectedPath = "B:\"
$fileBrowser.ShowNewFolderButton = $false
$fileBrowser.Description = "Select the folder with the audio which you wish to convert to Avid DNxHD 120 25P 48kHz"
$mbLoop = $true
$mbCount = 0001
$mbMaxJob = 4
while( $mbLoop ) {
if( $fileBrowser.ShowDialog() -eq "OK" ) {
$mbLoop = $false
$mbImage = ( Get-Item -Path "C:\Users\user\Desktop\lib\AudioOnly.jpg" )
$mbff32 = ( Get-Item -Path "C:\Users\user\Desktop\lib\ffmpeg32.exe" )
$mbff64 = ( Get-Item -Path "C:\Users\user\Desktop\lib\ffmpeg64.exe" )
$mbFolder = $fileBrowser.SelectedPath
$mbItemInc = ( ls $mbFolder\* -Include *.mp3, *.MP3, *.wav*, *.WAV*, *.ogg, *.OGG, *.wma, *.WMA, *.flac, *.FLAC, *.m4a, *.M4a )
$mbProgress = ( Get-ChildItem -Path $mbItemInc )
$mbHasRaw = ( $mbFolder + "\RAW" )
if( !( Test-Path -Path $mbHasRaw ) ) {
# force create a RAW folder if it does not exist
New-Item -ItemType Directory -Force -Path "$mbHasRaw"
}
foreach( $mbItem in $mbItemInc ) {
$mbCheck = $false
# output the progress
# Suggestion: You might want to consider updating this after starting the job and do the final update after running ex. Get-Job | Wait-Job to make the progress-bar stay until all processes are finished
#Write-Progress -Activity "Counting files for conversion" -status "Currently processing: $mbCount" -percentComplete ($mbCount / $mbItemInc.count*100)
# limit the run number
while ($mbCheck -eq $false) {
if( (Get-Job -State 'Running').count -lt $mbMaxJob) {
$mbScriptBlock = {
$mbItemName = $using:mbItem.BaseName
$mbNewItem = ( $using:mbFolder + "\RAW\" + $mbItemName + ".mov" )
$mbArgs = " -loop 1 -i $using:mbImage -i $using:mbItem -shortest -c:v dnxhd -b:v 120M -s 1920x1080 -pix_fmt yuv422p -r 25 -c:a pcm_s16le -ar 48k -af loudnorm=I=-12 $mbNewItem"
Start-Process -FilePath $using:mbff32 -ArgumentList $mbArgs -NoNewWindow -Wait
}
Start-Job -ScriptBlock $mbScriptBlock
#The job-thread doesn't know about $mbCount, better to increment it after starting the job
$mbCount++
$mbCheck = $true
}
}
}
} else {
$mbResponse = [System.Windows.Forms.MessageBox]::Show("You have exited out of the automation process!", "User has cancelled")
if( $mbResponse -eq "OK" ) {
return
}
}
}
$fileBrowser.SelectedPath
$fileBrowser.Dispose()
}
# call to function
mbAudioConvert
You set $mbCheck, but the while loop was testing $Check, which means the while loop will never execute, as $Check -eq $false evaluates to $false when $Check is not defined.
Variables created outside the job script block need to be passed as arguments, or you need to use the $using: variable scope to pass them in (PowerShell 3.0 or later). I added it to $mbItem, $mbff32, $mbImage and $mbFolder in the example, which were not defined.
$mbMaxJob is not defined, so the running-jobs check will never be true and no processes will start.
$mbCount is not defined, so the progress bar won't work.
echo "$mbCount. $mbNewItem" won't return anything unless you use Receive-Job at some point to get the output from a job.
Try:
#DemoValues
$mbItemInc = 1..10 | % { New-Item -ItemType File -Name "File$_.txt" }
$mbff32 = "something32"
$mbFolder = "c:\FooFolder"
$mbImage = "BarImage"
$mbMaxJob = 2
$mbCount = 0
foreach( $mbItem in $mbItemInc ) {
$mbCheck = $false
# output the progress
# Suggestion: You might want to consider updating this after starting the job and do the final update after running ex. Get-Job | Wait-Job to make the progress-bar stay until all processes are finished
Write-Progress -Activity "Counting files for conversion" -status "Currently processing: $mbCount" -percentComplete ($mbCount / $mbItemInc.count*100)
# limit the run number
while ($mbCheck -eq $false) {
if ((Get-Job -State 'Running').count -lt $mbMaxJob) {
$mbScriptBlock = {
Param($mbItem, $mbFolder, $mbImage, $mbff32)
#Filename without extension is already available in a FileInfo-object using the BaseName-property
$mbItemName = $mbItem.BaseName
$mbNewItem = ( $mbFolder + "\RAW\" + $mbItemName + ".mov" )
$mbArgs = "-loop 1 -i $mbImage -i $mbItem -shortest -c:v dnxhd -b:v 120M -s 1920x1080 -pix_fmt yuv422p -r 25 -c:a pcm_s16le -ar 48k -af loudnorm=I=-12 $mbNewItem"
Start-Process -FilePath $mbff32 -ArgumentList $mbArgs -NoNewWindow -Wait
}
Start-Job -ScriptBlock $mbScriptBlock -ArgumentList $mbItem, $mbFolder, $mbImage, $mbff32
#The job-thread doesn't know about $mbCount, better to increment it after starting the job
$mbCount++
$mbCheck = $true
}
}
}
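As the comment above suggests, once the loop has queued everything you can wait for the remaining jobs, pull any output, and then finish the progress bar. A minimal follow-up sketch:
# wait for whatever is still running, collect job output, then clean up and close the progress bar
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job
Write-Progress -Activity "Counting files for conversion" -Completed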
Here is my proposed solution:
cls
$FormatNameJob="FFMPEG"
$maxConcurrentJobs = 100
$DirWithFile="C:\temp"
$DestFolder="C:\temp2"
$TraitmentDir="C:\temp\traitment"
$PathFFMpeg="C:\Temp\ffmpeg\ffmpeg\bin\ffmpeg.exe"
$HistoFolder="C:\temp\histo"
# create the directories if they don't exist
New-Item -ItemType Directory -Path $TraitmentDir -Force | Out-Null
New-Item -ItemType Directory -Path $DestFolder -Force | Out-Null
New-Item -ItemType Directory -Path $HistoFolder -Force | Out-Null
while ($true)
{
"Loop File"
$ListeFile=Get-ChildItem $DirWithFile -file -Filter "*.avi"
if ($ListeFile.count -eq 0 )
{
Start-Sleep -Seconds 1
continue
}
# loop over the files to process
$ListeFile | %{
while ((get-job -State Running | where Name -eq $FormatNameJob ).Count -ge $maxConcurrentJobs)
{
Start-Sleep -Seconds 1
get-job -State Completed | where Name -eq $FormatNameJob | Remove-Job
}
"traitment file : {0}" -f $_.Name
#build newname and move item into traitment dir
$NewfileName="{0:yyyyMMddHHmmssfffff}_{1}" -f (get-date), $_.Name
$ItemTraitment=[io.path]::Combine($TraitmentDir, $NewfileName)
$mbNewItem ="{0}.mov" -f [io.path]::Combine($DestFolder, $_.BaseName)
Move-item $_.FullName -Destination $ItemTraitment
#build arguments and command
$mbArgs = " -loop 1 -i $ItemTraitment -shortest -c:v dnxhd -b:v 120M -s 1920x1080 -pix_fmt yuv422p -r 25 -c:a pcm_s16le -ar 48k -af loudnorm=I=-12 $mbNewItem"
$ScriptBlock=[scriptblock]::Create("Start-Process $PathFFMpeg -ArgumentList $mbArgs -Wait")
#add job
Start-Job -ScriptBlock $ScriptBlock -Name $FormatNameJob
}
}
So here is what I'm trying to do.
I have a PowerShell script that calls a bunch of batch files which install software. Is there any way to have a progress bar (GUI would be my choice) to track the status of the batch files being called?
Thanks in advance.
I found this on TechNet; the article was written by Ed Wilson.
When using the Write-Progress cmdlet, two parameters are required. The first is the activity parameter. The string supplied for this parameter appears on the first line in the progress dialog. The second required parameter is the status parameter. It appears under the Activity line.
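Following that description, a minimal call looks something like this (the strings are just placeholders):
# Activity appears on the first line of the progress display, Status underneath it
Write-Progress -Activity "Installing software" -Status "Running batch file 1 of 5"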
I can provide an example from my PowerShell scripts. I used this for the timer.
The code below belongs inside a for loop that increments $i on each pass (e.g. looping up to $variable.Count); only the loop body is shown here.
$time = 7
$percentage = $i / $time
$remaining = New-TimeSpan -Seconds ($time - $i)
$message = "{0:p0} complete" -f $percentage, $remaining
Write-Progress -Activity "Working" -status $message -PercentComplete ($percentage * 100)
This example is slightly different and uses a ForEach to iterate over items in a collection and update a progress bar. Below will run and update a progress bar over 60 seconds.
$time = 60 # seconds
foreach($i in (1..$time)) {
$percentage = $i / $time
$remaining = New-TimeSpan -Seconds ($time - $i)
$message = "{0:p0} complete, remaining time {1}" -f $percentage, $remaining
Write-Progress -Activity "Wait for SCCM scan" -status $message -PercentComplete ($percentage * 100)
Start-Sleep 1
Edit:
Here's code I came up with that successfully launches 5 batch files, each containing ping 1.1.1.1 -n 2 >NUL (with the -n count increasing to simulate elapsed time). Note: the PercentComplete option was misbehaving for me, and my inexperience really shines here, as I'm unsure it would work in this example. Edit: forgot to credit this post for getting Write-Progress to work.
$commands = gc C:\test4\l.txt
$i = 0
foreach ($bat in $commands){
Start-Process cmd -ArgumentList "/c $bat" -Wait -WindowStyle Minimized
$i++
Write-Progress -Activity "Batch File Test" -Status "Completed: $i of $($commands.Count)" -PercentComplete (($i / $commands.Count) * 100)
}#END FOREACH
Write-Host "Batch files finished running!"
I'm writing a script that takes an output file from another platform (which sadly doesn't produce CSV output; instead it's around 7 lines per record), grabs the lines that have the values I'm interested in (using Select-String), and then scans the MatchInfo array, extracting the exact text and building an array as I go, to export to CSV when finished.
My problem is that the original file has around 94,000 lines of text, and the MatchInfo object still has around 23,500 records in it, so it takes a while, especially building the array. I thought I'd throw in a Write-Progress, but the overhead of doing so is quite horrific: it increases the elapsed time roughly 8x compared to not having the progress bar.
Here's an example entry from the original file:
CREATE TRANCODE MPF OF TXOLID
AGENDA = T4XCLCSHINAG
,ANY_SC_LIST = NONE ,EVERY_SC_LIST = NONE
,SECURITY_CATEGORY = NONE ,FUNCTION = 14
,TRANCODE_VALUE = "MPF"
,TRANCODE_FUNCTION_MNEMONIC = NONE
,INSTALLATION_DATA = NONE
;
Now, for each of these, I only care about the values of AGENDA and TRANCODE_VALUE, so having read the file in using Get-Content, I then use Select-String as the most efficient way I know to filter out the rest of the lines in the file:
rv Start,Filtered,count,CSV
Write-Host "Reading Mainframe Extract File"
$Start = gc K:\TRANCODES.txt
Write-Host ("Read Complete : " + $Start.Count + " records found")
Write-Host "Filtering records for AGENDA/TRANCODE information"
$Filtered = $Start|Select-String -Pattern "AGENDA","TRANCODE_VALUE"
Write-Host ([String]($Filtered.Count/2) + " AGENDA/TRANCODE pairs found")
This leaves me with an object of type Microsoft.PowerShell.Commands.MatchInfo with contents like:
AGENDA = T4XCLCSHINAG
,TRANCODE_VALUE = "MPF"
AGENDA = T4XCLCSHINAG
,TRANCODE_VALUE = "MP"
That Select-String only took around 9 seconds, so there's no real need for a progress bar there.
However, the next step, grabbing the actual values (after the =) and putting them in an array, takes over 30 seconds, so I figured a Write-Progress would be helpful to the user and at least show that something is actually happening. But the addition of the progress bar seriously extends the elapsed time; see the following output from Measure-Command:
Measure-Command{$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
<#$count++
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100))#>
}}
TotalSeconds : 32.7902523
So that's 717.2308630680085 records/sec
Measure-Command{$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
$count++
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100))
}}
TotalSeconds : 261.3469632
Now only a paltry 89.98660799693897 records/sec
Any ideas how to improve the efficiency?
Here's the full script as-is:
rv Start,Filtered,count,CSV
Write-Host "Reading Mainframe Extract File"
$Start = gc K:\TRANCODES.txt
Write-Host ("Read Complete : " + $Start.Count + " records found")
Write-Host "Filtering records for AGENDA/TRANCODE information"
$Filtered = $Start|Select-String -Pattern "AGENDA","TRANCODE_VALUE"
Write-Host ([String]($Filtered.Count/2) + " AGENDA/TRANCODE pairs found")
Write-Host "Building table from the filter results"
[int]$count = 0
$CSV = @()
$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
$count++
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100))
}
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Table built : " + $CSV.Count + " rows created") `
-Id 1 `
-Completed
Write-Host ("Table built : " + $CSV.Count + " rows created")
Write-Host "Sorting and Exporting table to CSV file"
$CSV|Select TRANCODE,AGENDA|Sort TRANCODE |Export-CSV -notype K:\TRANCODES.CSV
Here's the output from the script with Write-Progress commented out:
Reading Mainframe Extract File
Read Complete : 94082 records found
Filtering records for AGENDA/TRANCODE information
11759 AGENDA/TRANCODE pairs found
Building table from the filter results
Table built : 11759 rows created
Sorting and Exporting table to CSV file
TotalSeconds : 75.2279182
EDIT:
I've adopted a modified version of the answer from @RomanKuzmin, so the appropriate code section now looks like:
Write-Host "Building table from the filter results"
[int]$count = 0
$CSV = @()
$sw = [System.Diagnostics.Stopwatch]::StartNew()
$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
$count++
If ($sw.Elapsed.TotalMilliseconds -ge 500) {
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100));
$sw.Reset();
$sw.Start()}
}
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Table built : " + $CSV.Count + " rows created") `
-Id 1 `
-Completed
Running the entire script through Measure-Command gives an elapsed time of 75.2279182 seconds with no Write-Progress, and 76.525382 seconds with the modified Write-Progress using @RomanKuzmin's suggestion - not bad at all! :-)
In cases like this, when progress is reported too often, I use this approach:
# fast even with Write-Progress
$sw = [System.Diagnostics.Stopwatch]::StartNew()
for($e = 0; $e -lt 1mb; ++$e) {
if ($sw.Elapsed.TotalMilliseconds -ge 500) {
Write-Progress -Activity Test -Status "Done $e"
$sw.Reset(); $sw.Start()
}
}
# very slow due to Write-Progress
for($e = 0; $e -lt 1mb; ++$e) {
Write-Progress -Activity Test -Status "Done $e"
}
Here is the suggestion on Connect....
I hope this helps someone else.
I spent a day on a similar problem: the progress bar was very, very slow.
My problem, however, was rooted in the fact that I had made the screen buffer of the PowerShell console extremely wide (9999 instead of the default 120).
This caused Write-Progress to slow down to the extreme every time it had to update the GUI progress bar.
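If you suspect the same issue, you can inspect and shrink the console screen buffer from within the session (the buffer width cannot be set smaller than the current window width); for example:
# check the current screen buffer size
$host.UI.RawUI.BufferSize
# shrink the buffer width back toward the default
$size = $host.UI.RawUI.BufferSize
$size.Width = 120
$host.UI.RawUI.BufferSize = $size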
I wanted to use Write-Progress to monitor piping Get-ChildItem to a file. The solution was to start a new job and then monitor the output of the job for changes from another process. PowerShell makes this quite easy.
# start the job to write the file index to the cache
$job = start-job {
param($path)
Get-ChildItem -Name -Attributes !D -Recurse $path > $path/.hscache
} -arg $(pwd)
# Wake every 200 ms and print the progress to the screen until the job is finished
while( $job.State -ne "Completed") {
Write-Progress -Activity ".hscache-build " -Status $(get-childitem .hscache).length
sleep -m 200
}
# clear the progress bar
Write-Progress -Activity ".hscache-build" -Completed
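Once the loop exits, it is worth collecting any output or errors from the job and removing it, for example:
# surface anything the job wrote and remove it from the job list
$job | Receive-Job
$job | Remove-Job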
I completely removed my old answer for the sake of efficiency. Although modulus checks are efficient enough, they do take time, especially when doing a modulus 20 against, say, 5 million iterations - that adds a decent amount of overhead.
For loops, all I do is something simple like the following, which is similar to the stopwatch method: reset your progress check with each Write-Progress:
$totalDone=0
$finalCount = $objects.count
$progressUpdate = [math]::floor($finalCount / 100)
$progressCheck = $progressUpdate+1
foreach ($object in $objects) {
<<do something with $object>>
$totalDone+=1
If ($progressCheck -gt $progressUpdate){
write-progress -activity "$totalDone out of $finalCount completed" -PercentComplete $(($totalDone / $finalCount) * 100)
$progressCheck = 0
}
$progressCheck += 1
}
The reason I set $progressCheck to $progressUpdate+1 is so that the progress update runs on the first pass through the loop.
This method will run a progress update every 1% of completion. If you want more or less, just change the division from 100 to your preferred number: 200 would mean an update every 0.5%, and 50 would mean every 2%.
Is there any way to copy a really large file (from one server to another) in PowerShell AND display its progress?
There are solutions out there to use Write-Progress in conjunction with looping to copy many files and display progress. However I can't seem to find anything that would show progress of a single file.
Any thoughts?
It seems like a much better solution to just use BitsTransfer; it comes out of the box on most Windows machines with PowerShell 2.0 or greater.
Import-Module BitsTransfer
Start-BitsTransfer -Source $Source -Destination $Destination -Description "Backup" -DisplayName "Backup"
I haven't heard of progress support in Copy-Item. If you don't want to use any external tool, you can experiment with streams. The buffer size matters; you may try different values (from 2 KB to 64 KB).
function Copy-File {
param( [string]$from, [string]$to)
$ffile = [io.file]::OpenRead($from)
$tofile = [io.file]::OpenWrite($to)
Write-Progress -Activity "Copying file" -status "$from -> $to" -PercentComplete 0
try {
[byte[]]$buff = new-object byte[] 4096
[long]$total = [int]$count = 0
do {
$count = $ffile.Read($buff, 0, $buff.Length)
$tofile.Write($buff, 0, $count)
$total += $count
if ($total % 1mb -eq 0) {
Write-Progress -Activity "Copying file" -status "$from -> $to" `
-PercentComplete ([long]($total * 100 / $ffile.Length))
}
} while ($count -gt 0)
}
finally {
$ffile.Dispose()
$tofile.Dispose()
Write-Progress -Activity "Copying file" -Status "Ready" -Completed
}
}
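Usage is straightforward; the paths below are only placeholders:
# copy one large file with a progress bar
Copy-File -from 'D:\ISO\install.wim' -to '\\server\share\install.wim'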
Alternatively, this option uses the native Windows progress bar...
$FOF_CREATEPROGRESSDLG = "&H0&"
$objShell = New-Object -ComObject "Shell.Application"
$objFolder = $objShell.NameSpace($DestLocation)
$objFolder.CopyHere($srcFile, $FOF_CREATEPROGRESSDLG)
cmd /c copy /z src dest
Not pure PowerShell, but executable from PowerShell, and it displays progress as a percentage.
I amended the code from stej (which was great, just what I needed!) to use a larger buffer and [long] for larger files, and used the System.Diagnostics.Stopwatch class to track elapsed time and estimate time remaining.
It also reports the transfer rate during the transfer and outputs the overall elapsed time and overall transfer rate.
It uses a 4 MB (4096*1024 bytes) buffer, which gave better-than-Win7-native throughput when copying from a NAS to a USB stick on a laptop over Wi-Fi.
On the to-do list:
add error handling (catch)
handle get-childitem file list as input
nested progress bars when copying multiple files (file x of y, % of total data copied, etc.)
input parameter for buffer size
Feel free to use/improve :-)
function Copy-File {
param( [string]$from, [string]$to)
$ffile = [io.file]::OpenRead($from)
$tofile = [io.file]::OpenWrite($to)
Write-Progress `
-Activity "Copying file" `
-status ($from.Split("\")|select -last 1) `
-PercentComplete 0
try {
$sw = [System.Diagnostics.Stopwatch]::StartNew();
[byte[]]$buff = new-object byte[] (4096*1024)
[long]$total = [long]$count = 0
do {
$count = $ffile.Read($buff, 0, $buff.Length)
$tofile.Write($buff, 0, $count)
$total += $count
[int]$pctcomp = ([int]($total/$ffile.Length* 100));
[int]$secselapsed = [int]($sw.elapsedmilliseconds.ToString())/1000;
if ( $secselapsed -ne 0 ) {
[single]$xferrate = (($total/$secselapsed)/1mb);
} else {
[single]$xferrate = 0.0
}
if ($total % 1mb -eq 0) {
if($pctcomp -gt 0)`
{[int]$secsleft = ((($secselapsed/$pctcomp)* 100)-$secselapsed);
} else {
[int]$secsleft = 0};
Write-Progress `
-Activity ($pctcomp.ToString() + "% Copying file # " + "{0:n2}" -f $xferrate + " MB/s")`
-status ($from.Split("\")|select -last 1) `
-PercentComplete $pctcomp `
-SecondsRemaining $secsleft;
}
} while ($count -gt 0)
$sw.Stop();
$sw.Reset();
}
finally {
write-host (($from.Split("\")|select -last 1) + `
" copied in " + $secselapsed + " seconds at " + `
"{0:n2}" -f [int](($ffile.length/$secselapsed)/1mb) + " MB/s.");
$ffile.Close();
$tofile.Close();
}
}
Not that I'm aware of. I wouldn't recommend using Copy-Item for this anyway. I don't think it was designed to be as robust as robocopy.exe, which supports retry - something you would want for extremely large file copies over the network.
I found none of the examples above met my needs. I wanted to copy a directory with subdirectories; the problem is my source directory had too many files, so I quickly hit the BITS file limit (I had > 1500 files), and the total directory size was quite large.
I found a function using RoboCopy that was a good starting point at https://keithga.wordpress.com/2014/06/23/copy-itemwithprogress/; however, it wasn't quite robust enough: it didn't handle trailing slashes or spaces gracefully, and it did not stop the copy when the script was halted.
Here is my refined version:
function Copy-ItemWithProgress
{
<#
.SYNOPSIS
RoboCopy with PowerShell progress.
.DESCRIPTION
Performs file copy with RoboCopy. Output from RoboCopy is captured,
parsed, and returned as Powershell native status and progress.
.PARAMETER Source
Directory to copy files from; this should not contain trailing slashes
.PARAMETER Destination
Directory to copy files to; this should not contain trailing slashes
.PARAMETER FilesToCopy
A wildcard expression of which files to copy, defaults to *.*
.PARAMETER RobocopyArgs
List of arguments passed directly to Robocopy.
Must not conflict with defaults: /ndl /TEE /Bytes /NC /nfl /Log
.PARAMETER ProgressID
When specified (>=0) will use this identifier for the progress bar
.PARAMETER ParentProgressID
When specified (>= 0) will use this identifier as the parent ID for progress bars
so that they appear nested which allows for usage in more complex scripts.
.OUTPUTS
Returns an object with the status of final copy.
REMINDER: Any error level below 8 can be considered a success by RoboCopy.
.EXAMPLE
C:\PS> .\Copy-ItemWithProgress c:\Src d:\Dest
Copy the contents of the c:\Src directory to a directory d:\Dest
Without the /e or /mir switch, only files from the root of c:\src are copied.
.EXAMPLE
C:\PS> .\Copy-ItemWithProgress '"c:\Src Files"' d:\Dest /mir /xf *.log -Verbose
Copy the contents of the 'c:\Name with Space' directory to a directory d:\Dest
/mir and /XF parameters are passed to robocopy, and script is run verbose
.LINK
https://keithga.wordpress.com/2014/06/23/copy-itemwithprogress
.NOTES
By Keith S. Garner (KeithGa#KeithGa.com) - 6/23/2014
With inspiration by Trevor Sullivan @pcgeek86
Tweaked by Justin Marshall - 02/20/2020
#>
[CmdletBinding()]
param(
[Parameter(Mandatory=$true)]
[string]$Source,
[Parameter(Mandatory=$true)]
[string]$Destination,
[Parameter(Mandatory=$false)]
[string]$FilesToCopy="*.*",
[Parameter(Mandatory = $true,ValueFromRemainingArguments=$true)]
[string[]] $RobocopyArgs,
[int]$ParentProgressID=-1,
[int]$ProgressID=-1
)
#handle spaces and trailing slashes
$SourceDir = '"{0}"' -f ($Source -replace "\\+$","")
$TargetDir = '"{0}"' -f ($Destination -replace "\\+$","")
$ScanLog = [IO.Path]::GetTempFileName()
$RoboLog = [IO.Path]::GetTempFileName()
$ScanArgs = @($SourceDir,$TargetDir,$FilesToCopy) + $RobocopyArgs + "/ndl /TEE /bytes /Log:$ScanLog /nfl /L".Split(" ")
$RoboArgs = @($SourceDir,$TargetDir,$FilesToCopy) + $RobocopyArgs + "/ndl /TEE /bytes /Log:$RoboLog /NC".Split(" ")
# Launch Robocopy Processes
write-verbose ("Robocopy Scan:`n" + ($ScanArgs -join " "))
write-verbose ("Robocopy Full:`n" + ($RoboArgs -join " "))
$ScanRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $ScanArgs
try
{
$RoboRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $RoboArgs
try
{
# Parse Robocopy "Scan" pass
$ScanRun.WaitForExit()
$LogData = get-content $ScanLog
if ($ScanRun.ExitCode -ge 8)
{
$LogData|out-string|Write-Error
throw "Robocopy $($ScanRun.ExitCode)"
}
$FileSize = [regex]::Match($LogData[-4],".+:\s+(\d+)\s+(\d+)").Groups[2].Value
write-verbose ("Robocopy Bytes: $FileSize `n" +($LogData -join "`n"))
#determine progress parameters
$ProgressParms=@{}
if ($ParentProgressID -ge 0) {
$ProgressParms['ParentID']=$ParentProgressID
}
if ($ProgressID -ge 0) {
$ProgressParms['ID']=$ProgressID
} else {
$ProgressParms['ID']=$RoboRun.Id
}
# Monitor Full RoboCopy
while (!$RoboRun.HasExited)
{
$LogData = get-content $RoboLog
$Files = $LogData -match "^\s*(\d+)\s+(\S+)"
if ($null -ne $Files )
{
$copied = ($Files[0..($Files.Length-2)] | ForEach-Object {$_.Split("`t")[-2]} | Measure-Object -sum).Sum
if ($LogData[-1] -match "(100|\d?\d\.\d)\%")
{
write-progress Copy -ParentID $ProgressParms['ID'] -percentComplete $LogData[-1].Trim("% `t") $LogData[-1]
$Copied += $Files[-1].Split("`t")[-2] /100 * ($LogData[-1].Trim("% `t"))
}
else
{
write-progress Copy -ParentID $ProgressParms['ID'] -Complete
}
write-progress ROBOCOPY -PercentComplete ($Copied/$FileSize*100) $Files[-1].Split("`t")[-1] @ProgressParms
}
}
} finally {
if (!$RoboRun.HasExited) {Write-Warning "Terminating copy process with ID $($RoboRun.Id)..."; $RoboRun.Kill() ; }
$RoboRun.WaitForExit()
# Parse full RoboCopy pass results, and cleanup
(get-content $RoboLog)[-11..-2] | out-string | Write-Verbose
remove-item $RoboLog
write-output ([PSCustomObject]@{ ExitCode = $RoboRun.ExitCode })
}
} finally {
if (!$ScanRun.HasExited) {Write-Warning "Terminating scan process with ID $($ScanRun.Id)..."; $ScanRun.Kill() }
$ScanRun.WaitForExit()
remove-item $ScanLog
}
}
I hate to be the one to bump an old subject, but I found this post extremely useful. After running performance tests on the snippets by stej and its refinement by Graham Gold, plus the BITS suggestion by Nacht, I have concluded that:
I really liked Graham's command with time estimations and speed readings.
I also really liked the significant speed increase of using BITS as my transfer method.
Faced with the decision between the two... I found that Start-BitsTransfer supported Asynchronous mode. So here is the result of my merging the two.
function Copy-File {
# ref: https://stackoverflow.com/a/55527732/3626361
param([string]$From, [string]$To)
try {
$job = Start-BitsTransfer -Source $From -Destination $To `
-Description "Moving: $From => $To" `
-DisplayName "Backup" -Asynchronous
# Start stopwatch
$sw = [System.Diagnostics.Stopwatch]::StartNew()
Write-Progress -Activity "Connecting..."
while ($job.JobState.ToString() -ne "Transferred") {
switch ($job.JobState.ToString()) {
"Connecting" {
break
}
"Transferring" {
$pctcomp = ($job.BytesTransferred / $job.BytesTotal) * 100
$elapsed = ($sw.elapsedmilliseconds.ToString()) / 1000
if ($elapsed -eq 0) {
$xferrate = 0.0
}
else {
$xferrate = (($job.BytesTransferred / $elapsed) / 1mb);
}
if ($job.BytesTransferred % 1mb -eq 0) {
if ($pctcomp -gt 0) {
$secsleft = ((($elapsed / $pctcomp) * 100) - $elapsed)
}
else {
$secsleft = 0
}
Write-Progress -Activity ("Copying file '" + ($From.Split("\") | Select-Object -last 1) + "' # " + "{0:n2}" -f $xferrate + "MB/s") `
-PercentComplete $pctcomp `
-SecondsRemaining $secsleft
}
break
}
"Transferred" {
break
}
Default {
throw $job.JobState.ToString() + " unexpected BITS state."
}
}
}
$sw.Stop()
$sw.Reset()
}
finally {
Complete-BitsTransfer -BitsJob $job
Write-Progress -Activity "Completed" -Completed
}
}
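Usage mirrors the earlier Copy-File examples; the paths below are placeholders, and BITS needs paths it can address directly (local or UNC):
# BITS-backed copy with progress, transfer rate and time-remaining readout
Copy-File -From 'C:\Backups\db.bak' -To '\\nas\backups\db.bak'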
This recursive function copies files and directories from the source path to the destination path.
If a file already exists at the destination, it is copied only when the source file is newer.
Function Copy-FilesBitsTransfer(
[Parameter(Mandatory=$true)][String]$sourcePath,
[Parameter(Mandatory=$true)][String]$destinationPath,
[Parameter(Mandatory=$false)][bool]$createRootDirectory = $true)
{
$item = Get-Item $sourcePath
$itemName = Split-Path $sourcePath -leaf
if (!$item.PSIsContainer){ #Item Is a file
$clientFileTime = Get-Item $sourcePath | select LastWriteTime -ExpandProperty LastWriteTime
if (!(Test-Path -Path $destinationPath\$itemName)){
Start-BitsTransfer -Source $sourcePath -Destination $destinationPath -Description "$sourcePath >> $destinationPath" -DisplayName "Copy Template file" -Confirm:$false
if (!$?){
return $false
}
}
else{
$serverFileTime = Get-Item $destinationPath\$itemName | select LastWriteTime -ExpandProperty LastWriteTime
if ($serverFileTime -lt $clientFileTime)
{
Start-BitsTransfer -Source $sourcePath -Destination $destinationPath -Description "$sourcePath >> $destinationPath" -DisplayName "Copy Template file" -Confirm:$false
if (!$?){
return $false
}
}
}
}
else{ #Item Is a directory
if ($createRootDirectory){
$destinationPath = "$destinationPath\$itemName"
if (!(Test-Path -Path $destinationPath -PathType Container)){
if (Test-Path -Path $destinationPath -PathType Leaf){ #In case item is a file, delete it.
Remove-Item -Path $destinationPath
}
New-Item -ItemType Directory $destinationPath | Out-Null
if (!$?){
return $false
}
}
}
Foreach ($fileOrDirectory in (Get-Item -Path "$sourcePath\*"))
{
$status = Copy-FilesBitsTransfer $fileOrDirectory $destinationPath $true
if (!$status){
return $false
}
}
}
return $true
}
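Called like this (placeholder paths), it mirrors a folder tree and only copies files that are missing at the destination or newer at the source:
# recursively copy a directory, skipping destination files that are already up to date
Copy-FilesBitsTransfer -sourcePath 'C:\Templates' -destinationPath 'D:\Backup\Templates'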
Sean Kearney from the Hey, Scripting Guy! blog has a solution that I found works pretty nicely.
Function Copy-WithProgress
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$Source,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=1)]
$Destination
)
$Source=$Source.tolower()
$Filelist=Get-Childitem "$Source" -Recurse
$Total=$Filelist.count
$Position=0
foreach ($File in $Filelist)
{
$Filename=$File.Fullname.tolower().replace($Source,'')
$DestinationFile=($Destination+$Filename)
Write-Progress -Activity "Copying data from '$source' to '$Destination'" -Status "Copying File $Filename" -PercentComplete (($Position/$total)*100)
Copy-Item $File.FullName -Destination $DestinationFile
$Position++
}
}
Then to use it:
Copy-WithProgress -Source $src -Destination $dest
Trevor Sullivan has a write-up on how to add a command called Copy-ItemWithProgress to PowerShell, built on Robocopy.