I'm writing a script that takes an output file from another platform (which sadly doesn't produce CSV output; instead each record spans around 7 lines), grabs the lines that have the values I'm interested in (using Select-String), and then scans the MatchInfo array, extracting the exact text and building an array as I go, to export to CSV when finished.
My problem is that the original file has around 94,000 lines of text, and the MatchInfo object still has around 23,500 records in it, so it takes a while, especially building the array. I thought I'd throw in a Write-Progress, but the overhead of doing so is quite horrific: it increases the elapsed time roughly 8x versus not having the progress bar.
Here's an example entry from the original file:
CREATE TRANCODE MPF OF TXOLID
AGENDA = T4XCLCSHINAG
,ANY_SC_LIST = NONE ,EVERY_SC_LIST = NONE
,SECURITY_CATEGORY = NONE ,FUNCTION = 14
,TRANCODE_VALUE = "MPF"
,TRANCODE_FUNCTION_MNEMONIC = NONE
,INSTALLATION_DATA = NONE
;
Now, for each of these, I only care about the values of AGENDA and TRANCODE_VALUE, so having read the file in using Get-Content, I then use Select-String as the most efficient way I know to filter out the rest of the lines in the file:
rv Start,Filtered,count,CSV
Write-Host "Reading Mainframe Extract File"
$Start = gc K:\TRANCODES.txt
Write-Host ("Read Complete : " + $Start.Count + " records found")
Write-Host "Filtering records for AGENDA/TRANCODE information"
$Filtered = $Start|Select-String -Pattern "AGENDA","TRANCODE_VALUE"
Write-Host ([String]($Filtered.Count/2) + " AGENDA/TRANCODE pairs found")
This leaves me with an object of type Microsoft.PowerShell.Commands.MatchInfo with contents like:
AGENDA = T4XCLCSHINAG
,TRANCODE_VALUE = "MPF"
AGENDA = T4XCLCSHINAG
,TRANCODE_VALUE = "MP"
Now, that Select-String only took around 9 seconds, so there's no real need for a progress bar there.
However, the next step, grabbing the actual values (after the =) and putting them in an array, takes over 30 seconds, so I figured a Write-Progress would be helpful to the user and at least show that something is actually happening. But the addition of the progress bar seriously extends the elapsed time; see the following output from Measure-Command:
Measure-Command{$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
<#$count++
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100))#>
}}
TotalSeconds : 32.7902523
So that's roughly 717 records/sec.
Measure-Command{$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
$count++
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100))
}}
TotalSeconds : 261.3469632
Now only a paltry 90 records/sec.
Any ideas how to improve the efficiency?
Here's the full script as-is:
rv Start,Filtered,count,CSV
Write-Host "Reading Mainframe Extract File"
$Start = gc K:\TRANCODES.txt
Write-Host ("Read Complete : " + $Start.Count + " records found")
Write-Host "Filtering records for AGENDA/TRANCODE information"
$Filtered = $Start|Select-String -Pattern "AGENDA","TRANCODE_VALUE"
Write-Host ([String]($Filtered.Count/2) + " AGENDA/TRANCODE pairs found")
Write-Host "Building table from the filter results"
[int]$count = 0
$CSV = @()
$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
$count++
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100))
}
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Table built : " + $CSV.Count + " rows created") `
-Id 1 `
-Completed
Write-Host ("Table built : " + $CSV.Count + " rows created")
Write-Host "Sorting and Exporting table to CSV file"
$CSV|Select TRANCODE,AGENDA|Sort TRANCODE |Export-CSV -notype K:\TRANCODES.CSV
Here's the output from the script with the Write-Progress commented out:
Reading Mainframe Extract File
Read Complete : 94082 records found
Filtering records for AGENDA/TRANCODE information
11759 AGENDA/TRANCODE pairs found
Building table from the filter results
Table built : 11759 rows created
Sorting and Exporting table to CSV file
TotalSeconds : 75.2279182
EDIT:
I've adopted a modified version of the answer from @RomanKuzmin, so the appropriate code section now looks like:
Write-Host "Building table from the filter results"
[int]$count = 0
$CSV = @()
$sw = [System.Diagnostics.Stopwatch]::StartNew()
$Filtered|foreach {If ($_.ToString() -Match 'AGENDA'){$obj = $null;
$obj = New-Object System.Object;
$obj | Add-Member -type NoteProperty -name AGENDA -Value $_.ToString().SubString(27)}
If ($_.ToString() -Match 'TRANCODE_VALUE'){$obj | Add-Member -type NoteProperty -name TRANCODE -Value ($_.ToString().SubString(28)).Replace('"','');
$CSV += $obj;
$obj = $null}
$count++
If ($sw.Elapsed.TotalMilliseconds -ge 500) {
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Processed " + $count + " of " + $Filtered.Count + " records") `
-Id 1 `
-PercentComplete ([int]($count/$Filtered.Count *100));
$sw.Reset();
$sw.Start()}
}
Write-Progress `
-Activity "Building table of values from filter results" `
-Status ("Table built : " + $CSV.Count + " rows created") `
-Id 1 `
-Completed
And running the entire script through Measure-Command gives an elapsed time of 75.2279182 seconds with no Write-Progress, and 76.525382 seconds with the modified Write-Progress using @RomanKuzmin's suggestion - not bad at all!! :-)
In such cases, when progress is reported too often, I use this approach:
# fast even with Write-Progress
$sw = [System.Diagnostics.Stopwatch]::StartNew()
for($e = 0; $e -lt 1mb; ++$e) {
if ($sw.Elapsed.TotalMilliseconds -ge 500) {
Write-Progress -Activity Test -Status "Done $e"
$sw.Reset(); $sw.Start()
}
}
# very slow due to Write-Progress
for($e = 0; $e -lt 1mb; ++$e) {
Write-Progress -Activity Test -Status "Done $e"
}
Here is the suggestion on Connect....
I hope this helps someone else.
I spent a day on a similar problem: Progress bar was very very slow.
My problem, however, was rooted in the fact that I had made the screen buffer for the PowerShell console extremely wide (9999 instead of the default 120).
This caused Write-Progress to slow to the extreme every time it had to update the GUI progress bar.
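If you suspect this, it's easy to check and shrink the buffer from within the session; a minimal sketch (120 is the default width mentioned above):
$host.UI.RawUI.BufferSize          # inspect the current console buffer size
$size = $host.UI.RawUI.BufferSize
$size.Width = 120                  # back to the default
$host.UI.RawUI.BufferSize = $size  # note: this throws if narrower than the window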
I wanted to use Write-Progress to monitor the piping of Get-ChildItem to a file. The solution was to start a new job and then monitor the output of the job for change from another process. PowerShell makes this quite easy.
# start the job to write the file index to the cache
$job = start-job {
param($path)
Get-ChildItem -Name -Attributes !D -Recurse $path > $path/.hscache
} -arg $(pwd)
# Wake every 200 ms and print the progress to the screen until the job is finished
while( $job.State -ne "Completed") {
Write-Progress -Activity ".hscache-build " -Status $(get-childitem .hscache).length
sleep -m 200
}
# clear the progress bar
Write-Progress -Activity ".hscache-build" -Completed
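Presumably you would also collect the job's output and clean it up once the loop exits; a small addition not in the original:
# retrieve any output/errors from the background job, then remove it
Receive-Job $job
Remove-Job $job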
I completely removed my old answer for the sake of efficiency. Although modulus checks are efficient enough, they do take time, especially if doing a modulus 20 against, say, 5 million; this adds a decent amount of overhead.
For loops, all I do is something simple like the following, which is similar to the stopwatch method: reset your progress check with each Write-Progress.
$totalDone=0
$finalCount = $objects.count
$progressUpdate = [math]::floor($finalCount / 100)
$progressCheck = $progressUpdate+1
foreach ($object in $objects) {
    <<do something with $object>>
    $totalDone += 1
    If ($progressCheck -gt $progressUpdate){
        write-progress -activity "$totalDone out of $finalCount completed" -PercentComplete $(($totalDone / $finalCount) * 100)
        $progressCheck = 0
    }
    $progressCheck += 1
}
The reason I set $progressCheck to $progressUpdate+1 is so that it will run the first time through the loop.
This method will run a progress update every 1% of completion. If you want more or less, just update the division from 100 to your preferred number: 200 would mean an update every 0.5%, and 50 would mean every 2%.
I'm doing a BITS transfer of daily imagery from a web server and I keep getting random drops during the transfer.
As it's cycling through the downloads I get the occasional "The connection was closed prematurely" or "An error occurred in the secure channel support". There are about 180 images in each folder and this happens for maybe 5-10% of them. I need to retry the download for those that didn't complete.
My code to do so follows - my imperfect work-around is to run the loop twice but I'm hoping to find a better solution.
# Set the URL where the images are located
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
# Set the local path where the images will be stored
$path = 'C:\images\Wind_Waves\latest\'
# Create a list of all assets returned from $url
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links | Where-Object{ $_.tagName -eq 'A' -and $_.href.ToLower().EndsWith("jpg") }
# Create a list of all href items from the table & call it $images
$images = $table.href
# Enumerate all of the images - for troubleshooting purposes - can be removed
$images
# Check to make sure there are images available for download - arbitrarily picked more than 2 $images
if($images.count -gt 2){
# Delete all of the files in the "latest" folder
Remove-Item ($path + "*.*") -Force
# For loop to check to see if we already have the image and, if not, download it
ForEach ($image in $images)
{
if(![System.IO.File]::Exists($path + $image)){
Write-Output "Downloading: " $image
Start-BitsTransfer -Source ($url + $image) -Destination $path -TransferType Download -RetryInterval 60
Start-Sleep 2
}
}
Get-BitsTransfer | Where-Object {$_.JobState -eq "Transferred"} | Complete-BitsTransfer
} else {
Write-Output "No images to download"}
I don't see any error handling in your code to resume/retry/restart on fail.
Meaning why is there no try/catch in the loop or the Get?
If the Get is on per download job in the loop, why is it outside the loop?
Download is the default for TransferType, so there is no need to specify it; it will normally generate an error if you do.
So, something like this. I did test this, but never got a fail. Then again, I have a very high-speed internet connection. If you are doing this inside an enterprise, edge devices (filters, proxies) could also be slowing things down, potentially forcing timeouts.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if($images.count -gt 2)
{
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images)
{
Try
{
Write-Verbose -Message "Downloading: $image" -Verbose
if(![System.IO.File]::Exists($path + $image))
{
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 2
}
Get-BitsTransfer |
Where-Object {$PSItem.JobState -eq 'Transferred'} |
Complete-BitsTransfer
}
Catch
{
$PSItem.Exception.Message
Write-Warning -Message "Download of $image not complete or failed. Attempting a resume/retry" -Verbose
Get-BitsTransfer -Name $image | Resume-BitsTransfer
}
}
}
else
{
Write-Warning -Message 'No images to download'
$PSItem.Exception.Message
}
See the help files
Resume-BitsTransfer
Module: BitsTransfer. Resumes a BITS transfer job.
# Example 1: Resume all BITS transfer jobs owned by the current user
Get-BitsTransfer | Resume-BitsTransfer
# Example 2: Resume a new BITS transfer job that was initially suspended
$Bits = Start-BitsTransfer -DisplayName "MyJob" -Suspended
Add-BitsTransfer -BitsJob $Bits -ClientFileName C:\myFile -ServerFileName http://www.SomeSiteName.com/file1
Resume-BitsTransfer -BitsJob $Bits -Asynchronous
# Example 3: Resume the BITS transfer by the specified display name
Get-BitsTransfer -Name "TestJob01" | Resume-BitsTransfer
Here's a somewhat modified version of the above code. It appears the BITS transfer job object goes away when the error occurs, so there is no use in trying to find/resume that job. Instead, I wrapped the entire Try-Catch block in a while loop that exits when the file is downloaded.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
$MaxRetries = 3 # Initialize the maximum number of retry attempts.
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object {
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if ($images.count -gt 2) {
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images) {
# Due to occasional failures to transfer, wrap the BITS transfer in a while loop
# re-initialize the exit counter for each new image
$retryCount = 0
while ($retryCount -le $MaxRetries){
Try {
Write-Verbose -Message "Downloading: $image" -Verbose
if (![System.IO.File]::Exists($path + $image)) {
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 2
}
# To get here, the transfer must have finished, so set the counter
# greater than the max value to exit the loop
$retryCount = $MaxRetries + 1
} # End Try block
Catch {
$PSItem.Exception.Message
$retryCount += 1
Write-Warning -Message "Download of $image not complete or failed. Attempting retry #: $retryCount" -Verbose
} # End Catch Block
} # End While loop for retries
} # End of loop over images
} # End of test for new images
else {
Write-Warning -Message 'No images to download'
$PSItem.Exception.Message
} # End of result for no new images
Here is a combination of the code that postanote provided and a Do-While loop to retry the download up to 5x if an error is thrown.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
$_.tagName -eq 'A' -and
$_.href.ToLower().EndsWith('jpg')
}
<#
# Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)
<# Check to make sure there are images available for download - arbitrarily
picked more than 2 $images #>
if($images.count -gt 2)
{
Remove-Item ($path + '*.*') -Force
ForEach ($image in $images)
{
# Create a Do-While loop to retry downloads up to 5 times if they fail
$Stoploop = $false
[int]$Retrycount = "0"
do{
Try
{
Write-Verbose -Message "Downloading: $image" -Verbose
if(![System.IO.File]::Exists($path + $image))
{
$StartBitsTransferSplat = @{
Source = ($url + $image)
Destination = $path
RetryInterval = 60
}
Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
Start-Sleep 10
$Stoploop = $true
}
Get-BitsTransfer |
Where-Object {$PSItem.JobState -eq 'Transferred'} |
Complete-BitsTransfer
}
Catch
{
if ($Retrycount -gt 5){
$PSItem.Exception.Message
Write-Warning -Message "Download of $image not complete or failed." -Verbose
$Stoploop = $true
}
else {
Write-Host "Could not download the image, retrying..."
Start-Sleep 10
$Retrycount = $Retrycount + 1
}
}
}
While ($Stoploop -eq $false)
}
}
else
{
Write-Warning -Message 'No images to download'
$PSItem.Exception.Message
}
I recently finished a script to ping every computer/workstation on a list and output it in a nice format.
There are thousands of computers to ping on the next list so it would take a while to run normally. How might I be able to multithread this so the job can be completed within a reasonable time-frame?
workflow Ping
{
param($computers)
foreach -parallel ($computer in $computers)
{
$status = Test-Connection -ComputerName $computer -Count 1 -Quiet
if (!$status)
{ Write-Output "Could not ping $computer" }
}
}
$computers = @(
"wd1600023",
"sleipnir"
)
Ping $computers
You could spawn multiple jobs
or you could use a workflow and a Foreach -parallel loop.
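For the first option, here is a minimal job-based sketch (my illustration, assuming $computers already holds the names; for thousands of hosts you would want to batch these):
# one job per computer; collect the failures at the end
$jobs = foreach ($computer in $computers) {
    Start-Job -ScriptBlock {
        param($c)
        if (-not (Test-Connection -ComputerName $c -Count 1 -Quiet)) {
            "Could not ping $c"
        }
    } -ArgumentList $computer
}
$jobs | Wait-Job | Receive-Job
$jobs | Remove-Job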
I found and used this script as a baseline for a similar function, Ping-IPRange.
I have modified it from the original and included it below.
You could easily adapt this to take in a list of hostnames if that is what you are using; a sketch of that adaptation follows the function.
Ping-IPRange.ps1
function Ping-IPRange
{
<#
.SYNOPSIS
Sends ICMP echo request packets to a range of IPv4 addresses between two given addresses.
.DESCRIPTION
This function lets you sends ICMP echo request packets ("pings") to
a range of IPv4 addresses using an asynchronous method.
Therefore this technique is very fast but comes with a warning.
Ping sweeping a large subnet or network with many switches may result in
a peak of broadcast traffic.
Use the -Interval parameter to adjust the time between each ping request.
For example, an interval of 60 milliseconds is suitable for wireless networks.
The RawOutput parameter switches the output to an unformatted
[System.Net.NetworkInformation.PingReply[]].
.INPUTS
None
You cannot pipe input to this function.
.OUTPUTS
The function only returns output from successful pings.
Type: System.Net.NetworkInformation.PingReply
The RawOutput parameter switches the output to an unformatted
[System.Net.NetworkInformation.PingReply[]].
.NOTES
Author : G.A.F.F. Jakobs
Created : August 30, 2014
Version : 6
Revision History: Kory Gill, 2016/01/09
formatting
added better error handling
close progress indicator when complete
.EXAMPLE
Ping-IPRange -StartAddress 192.168.1.1 -EndAddress 192.168.1.254 -Interval 20
IPAddress Bytes Ttl ResponseTime
--------- ----- --- ------------
192.168.1.41 32 64 371
192.168.1.57 32 128 0
192.168.1.64 32 128 1
192.168.1.63 32 64 88
192.168.1.254 32 64 0
In this example all the ip addresses between 192.168.1.1 and 192.168.1.254 are pinged using
a 20 millisecond interval between each request.
All the addresses that reply the ping request are listed.
.LINK
http://gallery.technet.microsoft.com/Fast-asynchronous-ping-IP-d0a5cf0e
#>
[CmdletBinding(ConfirmImpact='Low')]
Param(
[parameter(Mandatory = $true, Position = 0)]
[System.Net.IPAddress]$StartAddress,
[parameter(Mandatory = $true, Position = 1)]
[System.Net.IPAddress]$EndAddress,
[int]$Interval = 30,
[Switch]$RawOutput = $false
)
$timeout = 2000
function New-Range ($start, $end) {
[byte[]]$BySt = $start.GetAddressBytes()
[Array]::Reverse($BySt)
[byte[]]$ByEn = $end.GetAddressBytes()
[Array]::Reverse($ByEn)
$i1 = [System.BitConverter]::ToUInt32($BySt,0)
$i2 = [System.BitConverter]::ToUInt32($ByEn,0)
for ($x = $i1;$x -le $i2;$x++)
{
$ip = ([System.Net.IPAddress]$x).GetAddressBytes()
[Array]::Reverse($ip)
[System.Net.IPAddress]::Parse($($ip -join '.'))
}
}
$ipRange = New-Range $StartAddress $EndAddress
$IpTotal = $ipRange.Count
Get-Event -SourceIdentifier "ID-Ping*" | Remove-Event
Get-EventSubscriber -SourceIdentifier "ID-Ping*" | Unregister-Event
$ipRange | ForEach-Object {
[string]$VarName = "Ping_" + $_.Address
New-Variable -Name $VarName -Value (New-Object System.Net.NetworkInformation.Ping)
Register-ObjectEvent -InputObject (Get-Variable $VarName -ValueOnly) -EventName PingCompleted -SourceIdentifier "ID-$VarName"
(Get-Variable $VarName -ValueOnly).SendAsync($_,$timeout,$VarName)
Remove-Variable $VarName
try
{
$pending = (Get-Event -SourceIdentifier "ID-Ping*").Count
}
catch [System.InvalidOperationException]
{
$pending = 0
}
$index = [array]::indexof($ipRange,$_)
Write-Progress -Activity "Sending ping to" -Id 1 -status $_.IPAddressToString -PercentComplete (($index / $IpTotal) * 100)
$percentComplete = ($($index - $pending), 0 | Measure-Object -Maximum).Maximum
Write-Progress -Activity "ICMP requests pending" -Id 2 -ParentId 1 -Status ($index - $pending) -PercentComplete ($percentComplete/$IpTotal * 100)
Start-Sleep -Milliseconds $Interval
}
Write-Progress -Activity "Done sending ping requests" -Id 1 -Status 'Waiting' -PercentComplete 100
while ($pending -lt $IpTotal) {
Wait-Event -SourceIdentifier "ID-Ping*" | Out-Null
Start-Sleep -Milliseconds 10
try
{
$pending = (Get-Event -SourceIdentifier "ID-Ping*").Count
}
catch [System.InvalidOperationException]
{
$pending = 0
}
$percentComplete = ($($IpTotal - $pending), 0 | Measure-Object -Maximum).Maximum
Write-Progress -Activity "ICMP requests pending" -Id 2 -ParentId 1 -Status ($IpTotal - $pending) -PercentComplete ($percentComplete/$IpTotal * 100)
}
Write-Progress -Completed -Id 2 -ParentId 1 -Activity "Completed"
Write-Progress -Completed -Id 1 -Activity "Completed"
$Reply = @()
if ($RawOutput)
{
Get-Event -SourceIdentifier "ID-Ping*" | ForEach {
if ($_.SourceEventArgs.Reply.Status -eq "Success")
{
$Reply += $_.SourceEventArgs.Reply
}
Unregister-Event $_.SourceIdentifier
Remove-Event $_.SourceIdentifier
}
}
else
{
Get-Event -SourceIdentifier "ID-Ping*" | ForEach-Object {
if ($_.SourceEventArgs.Reply.Status -eq "Success")
{
$pinger = @{
IPAddress = $_.SourceEventArgs.Reply.Address
Bytes = $_.SourceEventArgs.Reply.Buffer.Length
Ttl = $_.SourceEventArgs.Reply.Options.Ttl
ResponseTime = $_.SourceEventArgs.Reply.RoundtripTime
}
$Reply += New-Object PSObject -Property $pinger
}
Unregister-Event $_.SourceIdentifier
Remove-Event $_.SourceIdentifier
}
}
if ($Reply.Count -eq 0)
{
Write-Verbose "Ping-IPRange : No IP address responded" -Verbose
}
return $Reply
}
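As mentioned above, adapting this to a list of hostnames mostly means replacing the New-Range output with your names. Here is a simplified synchronous sketch of that idea (computers.txt is a hypothetical input file; Ping.Send accepts hostnames as well as addresses):
$names = Get-Content .\computers.txt
$results = foreach ($name in $names) {
    $ping = New-Object System.Net.NetworkInformation.Ping
    try {
        $reply = $ping.Send($name, 2000)   # synchronous here for brevity
        if ($reply.Status -eq 'Success') {
            [PSCustomObject]@{ Name = $name; ResponseTime = $reply.RoundtripTime }
        }
    }
    catch { Write-Warning "Could not resolve or ping $name" }
    finally { $ping.Dispose() }
}
$results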
So I'm trying to make a backup script that will download a CSV from my MSSQL server, then zip the file, then upload the backup to Amazon S3.
The issue I'm having is that the table averages around 20 million lines when I run the script daily, and it looks like it just lags forever until it completes some 20 minutes later. I was wondering if there is a way to show a progress bar for the Invoke-Sqlcmd specifically. I've done some research, and all the examples I could find show a progress bar for a loop only, not for a single command's progress.
Here is my code:
ECHO "Starting Download"
Import-Module sqlps
#$SQLquery="SELECT * FROM dbo.$PREFIX$i"
$SQLquery="SELECT * FROM dbo.events"
ECHO "Executing query = $SQLquery"
$hostname = "."
$pass = "test"
$usern = "test"
$database = "theDB"
$result=invoke-sqlcmd -ServerInstance $hostname -query $SQLquery -HostName $hostname -Password $pass -Username $usern -Database $database -verbose
#echo $result
pause
$result |export-csv -path $CSVPATH -notypeinformation
pause
ECHO "Starting Zip:"
Compress-Archive -LiteralPath $CSVPATH -CompressionLevel Optimal -DestinationPath $ZIPPATH
ECHO "Starting Delete: $CSVPATH "
del "$CSVPATH"
echo "Removed $CSVNAME"
aws s3 cp $ZIPPATH s3://test_$ZIPNAME
pause
This script works, but as I said, I would like to add a progress bar to the Invoke-Sqlcmd so that it doesn't look like it's frozen while it downloads the huge file.
This is what I could find so far, but it only works for a loop's progression:
$VerbosePreference = "Continue"
Write-Verbose "Test Message"
for ($a=1; $a -lt 100; $a++) {
Write-Progress -Activity "Working..." -PercentComplete $a -CurrentOperation "$a% complete" -Status "Please wait."
Start-Sleep -Milliseconds 100
}
Considering your huge ~20 million record data set, it's probably a good idea to use some of the .NET classes in the System.Data.Common namespace. And I'm not sure about how Export-Csv is implemented, but System.IO.StreamWriter is very efficient for writing large files.
A simple tested/working example with inline comments:
# replace $tableName with yours
$sqlCount = "SELECT COUNT(*) FROM dbo.$($tableName)";
$sqlSelect = "SELECT * FROM dbo.$($tableName)";
$provider = [System.Data.Common.DbProviderFactories]::GetFactory('System.Data.SqlClient');
$connection = $provider.CreateConnection();
# replace $connectionString with yours, e.g.:
# "Data Source=$($INSTANCE-NAME);Initial Catalog=$($DATABASE-NAME);Integrated Security=True;";
$connection.ConnectionString = $connectionString;
$command = $connection.CreateCommand();
# get total record count for Write-Progress
$command.CommandText = $sqlCount;
$connection.Open();
$reader = $command.ExecuteReader();
$totalRecords = 0;
while ($reader.Read()) {
$totalRecords = $reader[0];
}
$reader.Dispose();
# select CSV data
$command.CommandText = $sqlSelect;
$reader = $command.ExecuteReader();
# get CSV field names
$columnNames = @();
for ($i = 0; $i -lt $reader.FieldCount; $i++) {
$columnNames += $reader.GetName($i);
}
# read and populate data one row at a time
$values = New-Object object[] $columnNames.Length;
$currentCount = 0;
# replace $CSVPATH with yours
$writer = New-Object System.IO.StreamWriter($CSVPATH);
$writer.WriteLine(($columnNames -join ','));
while ($reader.Read()) {
$null = $reader.GetValues($values);
$writer.WriteLine(($values -join ','));
if (++$currentCount % 1000 -eq 0) {
Write-Progress -Activity 'Reading data' `
-Status "Finished reading $currentCount out of $totalRecords records." `
-PercentComplete ($currentCount / $totalRecords * 100);
}
}
$command.Dispose();
$reader.Dispose();
$connection.Dispose();
$writer.Dispose();
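One caveat I should add: the plain -join ',' above does not quote fields that themselves contain commas, quotes, or line breaks. If your data can contain those, a small quoting helper along these lines (my addition, not part of the tested example) keeps the CSV valid:
# quote a single CSV field when needed (RFC 4180-style doubling of quotes)
function ConvertTo-CsvField([object]$value) {
    $s = [string]$value
    if ($s -match '[",\r\n]') { '"' + ($s -replace '"','""') + '"' } else { $s }
}
# then write each row as:
# $writer.WriteLine((($values | ForEach-Object { ConvertTo-CsvField $_ }) -join ','))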
So, I have a script which shows Download Progress from FTP.
I tried many ways to solve this task.
One of my conclusions was that the Register-ObjectEvent cmdlet is a really bad idea; async eventing is rather poorly supported in PowerShell.
And I stopped there:
$webclient.add_DownloadProgressChanged([System.Net.DownloadProgressChangedEventHandler]$webclient_DownloadProgressChanged )
....
$webclient_DownloadProgressChanged = {
param([System.Net.DownloadProgressChangedEventArgs]$Global:e)
$progressbaroverlay1.value=$e.ProgressPercentage
....
}
And everything in this script works fine, but you can see that I did this for just one file.
But then I started thinking: how can I download several files at the same time and show them in one progress bar?
So, does anyone have any great ideas, or a better way to solve this task?
P.S. I know, of course, that a single WebClient instance can only download one file at a time.
I came up with the same kind of [scriptblock]::Create approach, but I did use Register-ObjectEvent, with more or less success. The async downloads happen as background jobs and use events to communicate their progress back to the main script.
$Progress = @{}
$Isos = 'https://cdimage.debian.org/debian-cd/current/i386/iso-cd/debian-8.8.0-i386-CD-1.iso',
'https://cdimage.debian.org/debian-cd/current/i386/iso-cd/debian-8.8.0-i386-CD-2.iso'
$Count = 0
$WebClients = $Isos | ForEach-Object {
$w = New-Object System.Net.WebClient
$null = Register-ObjectEvent -InputObject $w -EventName DownloadProgressChanged -Action ([scriptblock]::Create(
"`$Percent = 100 * `$eventargs.BytesReceived / `$eventargs.TotalBytesToReceive; `$null = New-Event -SourceIdentifier MyDownloadUpdate -MessageData #($count,`$Percent)"
))
$w.DownloadFileAsync($_, "C:\PATH_TO_DOWNLOAD_FOLDER\$count.iso")
$Count = $Count + 1
$w
}
$event = Register-EngineEvent -SourceIdentifier MyDownloadUpdate -Action {
$progress[$event.MessageData[0]] = $event.MessageData[1]
}
$Timer = New-Object System.Timers.Timer
Register-ObjectEvent -InputObject $Timer -EventName Elapsed -Action {
if ($Progress.Values.Count -gt 0)
{
$PercentComplete = 100 * ($Progress.values | Measure-Object -Sum | Select-Object -ExpandProperty Sum) / $Progress.Values.Count
Write-Progress -Activity "Download Progress" -PercentComplete $PercentComplete
}
}
$timer.Interval = 100
$timer.AutoReset = $true
$timer.Start()
Telling when the downloads have finished is left as an exercise for the reader.
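One possibility (my sketch, not part of the original answer) is to register the DownloadFileCompleted event on each client inside the same ForEach-Object, and count the completion events in the main script:
# inside the loop, next to the progress registration:
$null = Register-ObjectEvent -InputObject $w -EventName DownloadFileCompleted -Action {
    $null = New-Event -SourceIdentifier MyDownloadComplete
}
# in the main script, wait until every download has raised its event:
while ((Get-Event -SourceIdentifier MyDownloadComplete -ErrorAction SilentlyContinue).Count -lt $Isos.Count) {
    Start-Sleep -Milliseconds 200
}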
You can use the BitsTransfer module's asynchronous download.
https://technet.microsoft.com/en-us/library/dd819420.aspx
Example code to show the overall progress of 3 files; you specify equal-length arrays of URLs and download locations, and you can extend it to your liking with exception handling etc.:
Import-Module BitsTransfer
[string[]]$url = @();
$url += 'https://www.samba.org/ftp/talloc/talloc-2.1.6.tar.gz';
$url += 'https://e.thumbs.redditmedia.com/pF525auqxnTG-FFj.png';
$url += 'http://bchavez.bitarmory.com/Skins/DayDreaming/images/bg-header.gif';
[string[]]$destination = @();
$destination += 'C:\Downloads\talloc-2.1.6.tar.gz';
$destination += 'C:\Downloads\pF525auqxnTG-FFj.png';
$destination += 'C:\Downloads\bg-header.gif';
$result = Start-BitsTransfer -Source $url -Destination $destination -TransferType Download -Asynchronous
$downloadsFinished = $false;
While ($downloadsFinished -ne $true) {
sleep 1
$jobstate = $result.JobState;
if($jobstate.ToString() -eq 'Transferred') { $downloadsFinished = $true }
$percentComplete = ($result.BytesTransferred / $result.BytesTotal) * 100
Write-Progress -Activity ('Downloading ' + $result.FilesTotal + ' files') -PercentComplete $percentComplete
}
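One thing worth adding (my note, not part of the original snippet): an asynchronous BITS job keeps its downloads as temporary placeholder files until the job is completed, so a final call along these lines is needed once the loop exits:
# finalize the job so the temporary files become the real downloads
Complete-BitsTransfer -BitsJob $result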
I see two possible concepts for this:
Create (with [scriptblock]::Create) an anonymous function on the fly, something like:
$Id = 0
... | ForEach {
$webclient[$Id].add_DownloadProgressChanged([System.Net.DownloadProgressChangedEventHandler]{[scriptblock]::Create("
....
`$webclient_DownloadProgressChanged = {
param([System.Net.DownloadProgressChangedEventArgs]`$e)
`$Global:ProgressPercentage[$Id]=`$e.ProgressPercentage
`$progressbaroverlay1.value=(`$Global:ProgressPercentage | Measure-Object -Average).Average
....
"
$Id++
})
}
Note that in this idea you need to prevent everything but the $Id from being directly interpreted, by escaping with a backtick.
Or if the function gets too large to be read, simplify the [ScriptBlock]:
[ScriptBlock]::Create("param(`$e); webclient_DownloadProgressChanged $Id `$e")
and call a global function:
$Global:webclient_DownloadProgressChanged($Id, $e) {
$Global:ProgressPercentage[$Id]=$e.ProgressPercentage
$progressbaroverlay1.value=($Global:ProgressPercentage | Measure-Object -Average).Average
}
Create your own custom background workers (threads):
For an example see: PowerShell: Job Event Action with Form not executed
In the main thread build your UI with progress bars
For each FTP download:
Create a shared (hidden) windows control (e.g. .TextBox[$Id])
Start a new background worker and share the related control, something like:
$SyncHash = [hashtable]::Synchronized(@{TextBox = $TextBox[$Id]})
Update the shared $SyncHash.TextBox.Text from within the WebWorker(s)
Capture the events (e.g. .Add_TextChanged) on each .TextBox[$Id] in the main thread
Update your progress bars accordingly, based on the average status passed in each .TextBox[$Id].Text (a simplified sketch follows)
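Here is a minimal sketch of that second concept, simplified to share a plain value through the synchronized hashtable and poll it from the main thread instead of wiring up TextBox events (the loop body is a stand-in for real download progress):
# background runspace reports into $SyncHash; the main thread polls and draws the bar
$SyncHash = [hashtable]::Synchronized(@{ Percent = 0 })
$ps = [powershell]::Create()
$ps.Runspace = [runspacefactory]::CreateRunspace()
$ps.Runspace.Open()
$ps.Runspace.SessionStateProxy.SetVariable('SyncHash', $SyncHash)
$null = $ps.AddScript({
    for ($i = 0; $i -le 100; $i += 5) {
        $SyncHash.Percent = $i          # worker publishes its progress here
        Start-Sleep -Milliseconds 200
    }
})
$handle = $ps.BeginInvoke()
while (-not $handle.IsCompleted) {
    Write-Progress -Activity 'Downloading' -PercentComplete $SyncHash.Percent
    Start-Sleep -Milliseconds 100
}
Write-Progress -Activity 'Downloading' -Completed
$ps.EndInvoke($handle)
$ps.Dispose()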
Is there any way to copy a really large file (from one server to another) in PowerShell AND display its progress?
There are solutions out there to use Write-Progress in conjunction with looping to copy many files and display progress. However I can't seem to find anything that would show progress of a single file.
Any thoughts?
It seems like a much better solution to just use BitsTransfer, it seems to come OOTB on most Windows machines with PowerShell 2.0 or greater.
Import-Module BitsTransfer
Start-BitsTransfer -Source $Source -Destination $Destination -Description "Backup" -DisplayName "Backup"
I haven't heard about progress with Copy-Item. If you don't want to use any external tool, you can experiment with streams. The buffer size varies; you may try different values (from 2 KB to 64 KB).
function Copy-File {
param( [string]$from, [string]$to)
$ffile = [io.file]::OpenRead($from)
$tofile = [io.file]::OpenWrite($to)
Write-Progress -Activity "Copying file" -status "$from -> $to" -PercentComplete 0
try {
[byte[]]$buff = new-object byte[] 4096
[long]$total = [int]$count = 0
do {
$count = $ffile.Read($buff, 0, $buff.Length)
$tofile.Write($buff, 0, $count)
$total += $count
if ($total % 1mb -eq 0) {
Write-Progress -Activity "Copying file" -status "$from -> $to" `
-PercentComplete ([long]($total * 100 / $ffile.Length))
}
} while ($count -gt 0)
}
finally {
$ffile.Dispose()
$tofile.Dispose()
Write-Progress -Activity "Copying file" -Status "Ready" -Completed
}
}
Alternatively, this option uses the native Windows progress bar...
$FOF_CREATEPROGRESSDLG = "&H0&"
$objShell = New-Object -ComObject "Shell.Application"
$objFolder = $objShell.NameSpace($DestLocation)
$objFolder.CopyHere($srcFile, $FOF_CREATEPROGRESSDLG)
cmd /c copy /z src dest
Not pure PowerShell, but executable from PowerShell, and it displays progress in percent.
I amended the code from stej (which was great, just what I needed!) to use a larger buffer, [long] for larger files, and the System.Diagnostics.Stopwatch class to track elapsed time and estimate the time remaining.
I also added reporting of the transfer rate during the transfer, and output of the overall elapsed time and overall transfer rate.
I'm using a 4 MB (4096*1024 bytes) buffer to get better-than-Win7-native throughput copying from a NAS to a USB stick on a laptop over WiFi.
On To-Do list:
add error handling (catch)
handle Get-ChildItem file list as input
nested progress bars when copying multiple files (file x of y, % of total data copied, etc.)
input parameter for buffer size
Feel free to use/improve :-)
function Copy-File {
param( [string]$from, [string]$to)
$ffile = [io.file]::OpenRead($from)
$tofile = [io.file]::OpenWrite($to)
Write-Progress `
-Activity "Copying file" `
-status ($from.Split("\")|select -last 1) `
-PercentComplete 0
try {
$sw = [System.Diagnostics.Stopwatch]::StartNew();
[byte[]]$buff = new-object byte[] (4096*1024)
[long]$total = [long]$count = 0
do {
$count = $ffile.Read($buff, 0, $buff.Length)
$tofile.Write($buff, 0, $count)
$total += $count
[int]$pctcomp = ([int]($total/$ffile.Length* 100));
[int]$secselapsed = [int]($sw.elapsedmilliseconds.ToString())/1000;
if ( $secselapsed -ne 0 ) {
[single]$xferrate = (($total/$secselapsed)/1mb);
} else {
[single]$xferrate = 0.0
}
if ($total % 1mb -eq 0) {
if($pctcomp -gt 0)`
{[int]$secsleft = ((($secselapsed/$pctcomp)* 100)-$secselapsed);
} else {
[int]$secsleft = 0};
Write-Progress `
-Activity ($pctcomp.ToString() + "% Copying file @ " + "{0:n2}" -f $xferrate + " MB/s")`
-status ($from.Split("\")|select -last 1) `
-PercentComplete $pctcomp `
-SecondsRemaining $secsleft;
}
} while ($count -gt 0)
$sw.Stop();
$sw.Reset();
}
finally {
write-host (($from.Split("\")|select -last 1) + `
" copied in " + $secselapsed + " seconds at " + `
"{0:n2}" -f [int](($ffile.length/$secselapsed)/1mb) + " MB/s.");
$ffile.Close();
$tofile.Close();
}
}
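Usage is the same two-positional-parameter call as stej's original, e.g. (hypothetical paths):
Copy-File '\\nas\share\video.mkv' 'E:\video.mkv'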
Not that I'm aware of. I wouldn't recommend using Copy-Item for this anyway. I don't think it was designed to be as robust as robocopy.exe, which supports retry, something you would want for extremely large file copies over the network.
I found none of the examples above met my needs. I wanted to copy a directory with subdirectories; the problem was my source directory had too many files, so I quickly hit the BITS file limit (I had > 1500 files), and the total directory size was quite large.
I found a function using robocopy that was a good starting point at https://keithga.wordpress.com/2014/06/23/copy-itemwithprogress/; however, it wasn't quite robust enough: it didn't handle trailing slashes or spaces gracefully, and it did not stop the copy when the script was halted.
Here is my refined version:
function Copy-ItemWithProgress
{
<#
.SYNOPSIS
RoboCopy with PowerShell progress.
.DESCRIPTION
Performs file copy with RoboCopy. Output from RoboCopy is captured,
parsed, and returned as Powershell native status and progress.
.PARAMETER Source
Directory to copy files from, this should not contain trailing slashes
.PARAMETER Destination
Directory to copy files to, this should not contain trailing slashes
.PARAMETER FilesToCopy
A wildcard expression of which files to copy, defaults to *.*
.PARAMETER RobocopyArgs
List of arguments passed directly to Robocopy.
Must not conflict with defaults: /ndl /TEE /Bytes /NC /nfl /Log
.PARAMETER ProgressID
When specified (>=0) will use this identifier for the progress bar
.PARAMETER ParentProgressID
When specified (>= 0) will use this identifier as the parent ID for progress bars
so that they appear nested which allows for usage in more complex scripts.
.OUTPUTS
Returns an object with the status of final copy.
REMINDER: Any error level below 8 can be considered a success by RoboCopy.
.EXAMPLE
C:\PS> .\Copy-ItemWithProgress c:\Src d:\Dest
Copy the contents of the c:\Src directory to a directory d:\Dest
Without the /e or /mir switch, only files from the root of c:\src are copied.
.EXAMPLE
C:\PS> .\Copy-ItemWithProgress '"c:\Src Files"' d:\Dest /mir /xf *.log -Verbose
Copy the contents of the 'c:\Name with Space' directory to a directory d:\Dest
/mir and /XF parameters are passed to robocopy, and script is run verbose
.LINK
https://keithga.wordpress.com/2014/06/23/copy-itemwithprogress
.NOTES
By Keith S. Garner (KeithGa@KeithGa.com) - 6/23/2014
With inspiration by Trevor Sullivan @pcgeek86
Tweaked by Justin Marshall - 02/20/2020
#>
[CmdletBinding()]
param(
[Parameter(Mandatory=$true)]
[string]$Source,
[Parameter(Mandatory=$true)]
[string]$Destination,
[Parameter(Mandatory=$false)]
[string]$FilesToCopy="*.*",
[Parameter(Mandatory = $true,ValueFromRemainingArguments=$true)]
[string[]] $RobocopyArgs,
[int]$ParentProgressID=-1,
[int]$ProgressID=-1
)
#handle spaces and trailing slashes
$SourceDir = '"{0}"' -f ($Source -replace "\\+$","")
$TargetDir = '"{0}"' -f ($Destination -replace "\\+$","")
$ScanLog = [IO.Path]::GetTempFileName()
$RoboLog = [IO.Path]::GetTempFileName()
$ScanArgs = @($SourceDir,$TargetDir,$FilesToCopy) + $RobocopyArgs + "/ndl /TEE /bytes /Log:$ScanLog /nfl /L".Split(" ")
$RoboArgs = @($SourceDir,$TargetDir,$FilesToCopy) + $RobocopyArgs + "/ndl /TEE /bytes /Log:$RoboLog /NC".Split(" ")
# Launch Robocopy Processes
write-verbose ("Robocopy Scan:`n" + ($ScanArgs -join " "))
write-verbose ("Robocopy Full:`n" + ($RoboArgs -join " "))
$ScanRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $ScanArgs
try
{
$RoboRun = start-process robocopy -PassThru -WindowStyle Hidden -ArgumentList $RoboArgs
try
{
# Parse Robocopy "Scan" pass
$ScanRun.WaitForExit()
$LogData = get-content $ScanLog
if ($ScanRun.ExitCode -ge 8)
{
$LogData|out-string|Write-Error
throw "Robocopy $($ScanRun.ExitCode)"
}
$FileSize = [regex]::Match($LogData[-4],".+:\s+(\d+)\s+(\d+)").Groups[2].Value
write-verbose ("Robocopy Bytes: $FileSize `n" +($LogData -join "`n"))
#determine progress parameters
$ProgressParms = @{}
if ($ParentProgressID -ge 0) {
$ProgressParms['ParentID']=$ParentProgressID
}
if ($ProgressID -ge 0) {
$ProgressParms['ID']=$ProgressID
} else {
$ProgressParms['ID']=$RoboRun.Id
}
# Monitor Full RoboCopy
while (!$RoboRun.HasExited)
{
$LogData = get-content $RoboLog
$Files = $LogData -match "^\s*(\d+)\s+(\S+)"
if ($null -ne $Files )
{
$copied = ($Files[0..($Files.Length-2)] | ForEach-Object {$_.Split("`t")[-2]} | Measure-Object -sum).Sum
if ($LogData[-1] -match "(100|\d?\d\.\d)\%")
{
write-progress Copy -ParentID $ProgressParms['ID'] -percentComplete $LogData[-1].Trim("% `t") $LogData[-1]
$Copied += $Files[-1].Split("`t")[-2] /100 * ($LogData[-1].Trim("% `t"))
}
else
{
write-progress Copy -ParentID $ProgressParms['ID'] -Complete
}
write-progress ROBOCOPY -PercentComplete ($Copied/$FileSize*100) $Files[-1].Split("`t")[-1] @ProgressParms
}
}
} finally {
if (!$RoboRun.HasExited) {Write-Warning "Terminating copy process with ID $($RoboRun.Id)..."; $RoboRun.Kill() ; }
$RoboRun.WaitForExit()
# Parse full RoboCopy pass results, and cleanup
(get-content $RoboLog)[-11..-2] | out-string | Write-Verbose
remove-item $RoboLog
write-output ([PSCustomObject]@{ ExitCode = $RoboRun.ExitCode })
}
} finally {
if (!$ScanRun.HasExited) {Write-Warning "Terminating scan process with ID $($ScanRun.Id)..."; $ScanRun.Kill() }
$ScanRun.WaitForExit()
remove-item $ScanLog
}
}
Hate to be the one to bump an old subject, but I found this post extremely useful. After running performance tests on the snippets by stej and its refinement by Graham Gold, plus the BITS suggestion by Nacht, I have deduced that:
I really liked Graham's command with time estimations and speed readings.
I also really liked the significant speed increase of using BITS as my transfer method.
Faced with the decision between the two... I found that Start-BitsTransfer supports an asynchronous mode. So here is the result of merging the two.
function Copy-File {
# ref: https://stackoverflow.com/a/55527732/3626361
param([string]$From, [string]$To)
try {
$job = Start-BitsTransfer -Source $From -Destination $To `
-Description "Moving: $From => $To" `
-DisplayName "Backup" -Asynchronous
# Start stopwatch
$sw = [System.Diagnostics.Stopwatch]::StartNew()
Write-Progress -Activity "Connecting..."
while ($job.JobState.ToString() -ne "Transferred") {
switch ($job.JobState.ToString()) {
"Connecting" {
break
}
"Transferring" {
$pctcomp = ($job.BytesTransferred / $job.BytesTotal) * 100
$elapsed = ($sw.elapsedmilliseconds.ToString()) / 1000
if ($elapsed -eq 0) {
$xferrate = 0.0
}
else {
$xferrate = (($job.BytesTransferred / $elapsed) / 1mb);
}
if ($job.BytesTransferred % 1mb -eq 0) {
if ($pctcomp -gt 0) {
$secsleft = ((($elapsed / $pctcomp) * 100) - $elapsed)
}
else {
$secsleft = 0
}
Write-Progress -Activity ("Copying file '" + ($From.Split("\") | Select-Object -last 1) + "' # " + "{0:n2}" -f $xferrate + "MB/s") `
-PercentComplete $pctcomp `
-SecondsRemaining $secsleft
}
break
}
"Transferred" {
break
}
Default {
throw $job.JobState.ToString() + " unexpected BITS state."
}
}
}
$sw.Stop()
$sw.Reset()
}
finally {
Complete-BitsTransfer -BitsJob $job
Write-Progress -Activity "Completed" -Completed
}
}
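It is called the same way as the earlier versions, for example (hypothetical paths):
Copy-File -From '\\server\share\backup.vhdx' -To 'D:\Backup\backup.vhdx'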
This recursive function copies files and directories recursively from a source path to a destination path.
If a file already exists on the destination path, it is copied only when the source file is newer.
Function Copy-FilesBitsTransfer(
[Parameter(Mandatory=$true)][String]$sourcePath,
[Parameter(Mandatory=$true)][String]$destinationPath,
[Parameter(Mandatory=$false)][bool]$createRootDirectory = $true)
{
$item = Get-Item $sourcePath
$itemName = Split-Path $sourcePath -leaf
if (!$item.PSIsContainer){ #Item Is a file
$clientFileTime = Get-Item $sourcePath | select LastWriteTime -ExpandProperty LastWriteTime
if (!(Test-Path -Path $destinationPath\$itemName)){
Start-BitsTransfer -Source $sourcePath -Destination $destinationPath -Description "$sourcePath >> $destinationPath" -DisplayName "Copy Template file" -Confirm:$false
if (!$?){
return $false
}
}
else{
$serverFileTime = Get-Item $destinationPath\$itemName | select LastWriteTime -ExpandProperty LastWriteTime
if ($serverFileTime -lt $clientFileTime)
{
Start-BitsTransfer -Source $sourcePath -Destination $destinationPath -Description "$sourcePath >> $destinationPath" -DisplayName "Copy Template file" -Confirm:$false
if (!$?){
return $false
}
}
}
}
else{ #Item Is a directory
if ($createRootDirectory){
$destinationPath = "$destinationPath\$itemName"
if (!(Test-Path -Path $destinationPath -PathType Container)){
if (Test-Path -Path $destinationPath -PathType Leaf){ #In case item is a file, delete it.
Remove-Item -Path $destinationPath
}
New-Item -ItemType Directory $destinationPath | Out-Null
if (!$?){
return $false
}
}
}
Foreach ($fileOrDirectory in (Get-Item -Path "$sourcePath\*"))
{
$status = Copy-FilesBitsTransfer $fileOrDirectory $destinationPath $true
if (!$status){
return $false
}
}
}
return $true
}
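A usage sketch (hypothetical paths; the function returns $true on success and $false on the first failure):
$ok = Copy-FilesBitsTransfer -sourcePath 'C:\Templates' -destinationPath '\\server\backup'
if (-not $ok) { Write-Warning 'Copy did not complete successfully' }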
Sean Kearney from the Hey, Scripting Guy! blog has a solution that I found works pretty nicely.
Function Copy-WithProgress
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$Source,
[Parameter(Mandatory=$true,
ValueFromPipelineByPropertyName=$true,
Position=0)]
$Destination
)
$Source=$Source.tolower()
$Filelist = Get-Childitem "$Source" -Recurse
$Total=$Filelist.count
$Position=0
foreach ($File in $Filelist)
{
$Filename=$File.Fullname.tolower().replace($Source,'')
$DestinationFile=($Destination+$Filename)
Write-Progress -Activity "Copying data from '$source' to '$Destination'" -Status "Copying File $Filename" -PercentComplete (($Position/$total)*100)
Copy-Item $File.FullName -Destination $DestinationFile
$Position++
}
}
Then to use it:
Copy-WithProgress -Source $src -Destination $dest
Trevor Sullivan has a write-up on how to add a command called Copy-ItemWithProgress to PowerShell, built on Robocopy.