As the title suggests, how do I write an if statement that compares two files? This is what I have so far, but even though the files are the same it still keeps prompting me for an update:
Add-Type -AssemblyName 'PresentationFramework'
$continue = 'Yes'
Invoke-WebRequest -Uri "https://<url>/file" -OutFile "C:\temp\file"
if ( "C:\temp\file" -ne "C:\app\file" ) {
    $caption = "Update Available!"
    $message = "There is a new version available. Do you want to update to the latest version?"
    $continue = [System.Windows.MessageBox]::Show($message, $caption, 'YesNo');
} else {
    Write-Output "You have the latest version!"
}
if ($continue -eq 'Yes') {
    Write-Output "Downloading application. Please wait..."
    # Do something
} else {
    Write-Output "Cancelled"
}
Any help would be greatly appreciated! Thanks
This is the logic I would personally follow; you can restructure it later for your needs, but in very basic steps:
Get the MD5 sum of my current file.
Query the new file with Invoke-WebRequest and, using a MemoryStream, check whether the new hash is the same as the previous one.
If it's not, write that stream to a FileStream.
$ErrorActionPreference = 'Stop'

try {
    # Hash the file currently on disk
    $previousFile = "C:\app\file"
    $previousHash = (Get-FileHash $previousFile -Algorithm MD5).Hash

    # Download the remote file into memory and hash the stream
    $webReq    = Invoke-WebRequest 'https://<url>/file'
    $memStream = [IO.MemoryStream] $webReq.Content
    $newHash   = (Get-FileHash -InputStream $memStream -Algorithm MD5).Hash

    # Only write to disk when the hashes differ
    if($previousHash -ne $newHash) {
        $file = [IO.File]::Create('path\to\mynewfile.ext')
        $memStream.WriteTo($file)
    }
}
finally {
    # $file only exists when an update was written; skip any nulls before disposing
    $file, $memStream | Where-Object { $_ } | ForEach-Object Dispose
}
The main idea behind it is to do the check entirely in memory, so the file is only written to disk when an update is actually needed.
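Applied to the original script, the comparison "C:\temp\file" -ne "C:\app\file" is the culprit: it compares the two path strings, which are never equal, so the prompt always appears. A minimal sketch of the hash-based test, using the paths from the question:

# Compare contents by hash rather than comparing the path strings
$remoteHash = (Get-FileHash "C:\temp\file" -Algorithm MD5).Hash
$localHash  = (Get-FileHash "C:\app\file"  -Algorithm MD5).Hash
if ($localHash -ne $remoteHash) {
    # hashes differ: offer the update
}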
I have a scenario where I am doing any number of things with a network resource: copying files or folders, deleting a ZIP file after it has been unzipped locally, checking an EXE to see if it was downloaded from the internet and needs to be unblocked, running a software install, etc. All of these tasks can be affected by things like an installer that hasn't released a file lock on a log file to be deleted, or the unzip not having released the file lock on the ZIP file, or the network being for a fleeting moment "unavailable" so a network file is not found.
I have a technique that works well for handling this scenario: I loop and react to the specific exception, and when that exception occurs I just sleep for 6 seconds. The loop runs up to 10 times before I give up. Here is the code for dealing with an Autodesk log that is still locked by the Autodesk installer.
$waitRetryCount = 10
:waitForAccess do {
    try {
        Remove-Item $odisLogPath -Recurse -errorAction:Stop
        $retry = $False
    } catch {
        if (($PSItem.Exception -is [System.IO.IOException]) -or ($PSItem.Exception.InnerException -and ($PSItem.Exception.InnerException -is [System.IO.IOException]))) {
            if ($waitRetryCount -eq 0) {
                $invokeODISLogManagement.log.Add('E_Error deleting ODIS logs: retry limit exceeded')
                break waitForAccess
            } else {
                $retry = $True
                $waitRetryCount--
                Start-Sleep -s:6
            }
        } else {
            $invokeODISLogManagement.log.Add('E_Error deleting ODIS logs')
            $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.GetType().FullName)")
            $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.Message)")
            if ($PSItem.Exception.InnerException) {
                $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.InnerException.GetType().FullName)")
                $invokeODISLogManagement.log.Add("=_$($PSItem.Exception.InnerException.Message)")
            }
            $retry = $False
        }
    }
} while ($retry)
The thing is, I would like to convert this to a function, since it needs to be handled in a lot of places. So I would need to pass the function the specific exception I am looking for and the code to run in the try block, and get back a log (as a Generic.List) that I can then add to the actual log list. The first and last aspects I have, but I am unsure of the best approach for the code to try. In the above example it is a single line, Remove-Item $odisLogPath -Recurse -errorAction:Stop, but I suspect it could be multiple lines.
To start playing with this I verified that this does seem to work, at least with a single line of code.
$code = {Get-Item '\\noServer\folder\file.txt' -errorAction:Stop}
try {
    & $code
} catch {
    Write-Host "$($_.Exception.GetType().FullName)"
}
But the error action is going to be duplicated a lot, so I thought to maybe address that within the function, however
$code = {Get-Item '\\noServer\folder\file.txt'}
try {
    & $code -errorAction:Stop
} catch {
    Write-Host "$($_.Exception.GetType().FullName)"
}
does NOT work. I get the exception uncaught.
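Presumably that is because a script block has no common parameters: the -errorAction:Stop just lands in the block's $args, and Get-Item never sees it. One workaround sketch (my own assumption; it relies on preference variables being dynamically scoped, so it will not reach blocks exported from modules) is to set $ErrorActionPreference in the scope that invokes the block:

$code = {Get-Item '\\noServer\folder\file.txt'}
try {
    $ErrorActionPreference = 'Stop' # inherited by the script block invoked below
    & $code
} catch {
    Write-Host "$($_.Exception.GetType().FullName)"
}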
So, my questions are:
1: Is this the right direction? I am pretty sure it is, but perhaps someone has a gotcha that I am not seeing yet. :)
2: Is there a mechanism to add the -errorAction:Stop in the try, so I don't need to do it (or remember to do it) at every use of this new function?
3: I seem to remember reading about a programming concept of passing code to a function, and I can't remember what that is called, but I would like to know the generic term. Indeed, it probably would help if I could tag it for this post. I had thought it might be "lama", but a quick search suggests that is not the case? Being self-taught sucks sometimes.
EDIT:
I have now implemented a function, that starts to do what I want.
function Invoke-PxWaitForAccess {
    param (
        [System.Management.Automation.ScriptBlock]$code,
        [String]$path
    )
    try {
        (& $code -path $path)
    } catch {
        Return "$($_.Exception.GetType().FullName)!!"
    }
}
$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Write-Host "$path!"; Get-Item $path}
Invoke-PxWaitForAccess -code $code -path $path
I do wonder if the path couldn't somehow be encapsulated in the $code variable itself, since this implementation means it can ONLY be used where the code being run has a single variable called $path.
And, still wondering if this really is the best, or even a good, way to proceed? Or are there arguments for just implementing my loop 50 some odd times in all the situations where I need this behavior.
Also worth noting that this code does not yet implement the loop or address the fact that different exceptions apply in different situations.
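(One possibility I have not settled on, sketched here as an assumption rather than working production code: GetNewClosure() bakes the caller's variables into the block itself, so the function would not need a -path parameter at all.)

$path = '\\noServer\folder\file.txt'
$code = { Get-Item $path -errorAction:Stop }.GetNewClosure()
& $code # $path resolves from the values captured when the closure was created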
EDIT #2:
And here is a more complete implementation, though it fails because it seems I am not actually passing a type, even though it looks like I am. So I get an error because what is to the right of -is must be an actual type.
function Invoke-PxWaitForAccess {
    param (
        [System.Management.Automation.ScriptBlock]$code,
        [String]$path,
        [Type]$exceptionType
    )
    $invokeWaitForAccess = @{
        success = $Null
        log = [System.Collections.Generic.List[String]]::new()
    }
    $waitRetryCount = 2
    :waitForAccess do {
        try {
            Write-Host "$path ($waitRetryCount)"
            & $code -path $path
            $retry = $False
        } catch {
            Write-Host "!$($PSItem.Exception.GetType().FullName)"
            if (($PSItem.Exception -is $exceptionType) -or ($PSItem.Exception.InnerException -and ($PSItem.Exception.InnerException -is $exceptionType))) {
                Write-Host "($waitRetryCount)"
                if ($waitRetryCount -eq 0) {
                    $invokeWaitForAccess.log.Add('E_retry limit exceeded')
                    break waitForAccess
                } else {
                    $retry = $True
                    $waitRetryCount--
                    Start-Sleep -s:6
                }
            } else {
                $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.GetType().FullName)")
                $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.Message)")
                if ($PSItem.Exception.InnerException) {
                    $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.InnerException.GetType().FullName)")
                    $invokeWaitForAccess.log.Add("=_$($PSItem.Exception.InnerException.Message)")
                }
                $retry = $False
            }
        }
    } while ($retry)
    if ($invokeWaitForAccess.log.Count -eq 0) {
        $invokeWaitForAccess.success = $True
    } else {
        $invokeWaitForAccess.success = $False
    }
    return $invokeWaitForAccess
}
$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Get-Item $path -errorAction:Stop}
if ($invoke = (Invoke-PxWaitForAccess -code $code -path $path -type ([System.Management.Automation.ItemNotFoundException])).success) {
    Write-Host 'Good'
} else {
    foreach ($line in $invoke.log) {
        Write-Host "$line"
    }
}
EDIT #3: This is what I have now, and it seems to work fine. But the code I am passing will sometimes be something like Remove-Item, where the error is [System.IO.IOException], while at other times I actually need to return more than an error, like here where the code involves Get-Item. And that means defining the code block outside the function with a reference to a variable inside the function, which seems, well, fugly to me. It may be that what I am trying to do is just more complicated than PowerShell is really designed to handle, but it seems MUCH more likely that there is a more elegant way to do it. Without being able to manipulate the script block from within the function, I don't see any good options.
For what it is worth, this last example shows a failure where the exception I am accepting for the repeat occurs and hits the retry limit; an exception that immediately fails because it is not the one I am looping on; and an example where I return something. A fourth condition would be deleting while waiting on [System.IO.IOException], where success would return nothing: no item and no error log.
function Invoke-PxWaitForAccess {
    param (
        [System.Management.Automation.ScriptBlock]$code,
        [String]$path,
        [Type]$exceptionType
    )
    $invokeWaitForAccess = @{
        item = $null
        errorLog = [System.Collections.Generic.List[String]]::new()
    }
    $waitRetryCount = 2
    :waitForSuccess do {
        try {
            & $code -path $path
            $retry = $False
        } catch {
            if (($PSItem.Exception -is $exceptionType) -or ($PSItem.Exception.InnerException -and ($PSItem.Exception.InnerException -is $exceptionType))) {
                if ($waitRetryCount -eq 0) {
                    $invokeWaitForAccess.errorLog.Add('E_Retry limit exceeded')
                    break waitForSuccess
                } else {
                    $retry = $True
                    $waitRetryCount--
                    Start-Sleep -s:6
                }
            } else {
                $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.GetType().FullName)")
                $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.Message)")
                if ($PSItem.Exception.InnerException) {
                    $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.InnerException.GetType().FullName)")
                    $invokeWaitForAccess.errorLog.Add("=_$($PSItem.Exception.InnerException.Message)")
                }
                $retry = $False
            }
        }
    } while ($retry)
    return $invokeWaitForAccess
}
CLS
$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Get-Item $path -errorAction:Stop}
$invoke = (Invoke-PxWaitForAccess -code $code -path $path -exceptionType:([System.Management.Automation.ItemNotFoundException]))
if ($invoke.errorLog.count -eq 0) {
    Write-Host "Good $path"
} else {
    foreach ($line in $invoke.errorLog) {
        Write-Host "$line"
    }
}
Write-Host

$path = '\\noServer\folder\file.txt'
$code = {param ([String]$path) Get-Item $path -errorAction:Stop}
$invoke = (Invoke-PxWaitForAccess -code $code -path $path -exceptionType:([System.IO.IOException]))
if ($invoke.errorLog.count -eq 0) {
    Write-Host "Good $path"
} else {
    foreach ($line in $invoke.errorLog) {
        Write-Host "$line"
    }
}
Write-Host

$path = '\\Mac\iCloud Drive\Px Tools 3.#\# Dev 3.4.5\Definitions.xml'
$code = {param ([String]$path) $invokeWaitForAccess.item = Get-Item $path -errorAction:Stop}
$invoke = (Invoke-PxWaitForAccess -code $code -path $path -exceptionType:([System.Management.Automation.ItemNotFoundException]))
if ($invoke.errorLog.count -eq 0) {
    Write-Host "Good $path !"
    Write-Host "$($invoke.item)"
} else {
    foreach ($line in $invoke.errorLog) {
        Write-Host "$line"
    }
}
Write-Host
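For what it is worth, one way to avoid that external reference (a sketch based on my own assumption, not something settled in this thread) is to let the block simply emit the object and have the function capture the pipeline output itself:

# Inside the function, capture the block's output instead of discarding it:
$invokeWaitForAccess.item = & $code -path $path
# The caller's block then stays self-contained:
$code = {param ([String]$path) Get-Item $path -errorAction:Stop}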
I'm doing a BITS transfer of daily imagery from a web server and I keep getting random drops during the transfer.
As it's cycling through the downloads I get the occasional "The connection was closed prematurely" or "An error occurred in the secure channel support". There are about 180 images in each folder and this happens for maybe 5-10% of them. I need to retry the download for those that didn't complete.
My code to do so follows - my imperfect work-around is to run the loop twice but I'm hoping to find a better solution.
# Set the URL where the images are located
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
# Set the local path where the images will be stored
$path = 'C:\images\Wind_Waves\latest\'
# Create a list of all assets returned from $url
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links | Where-Object{ $_.tagName -eq 'A' -and $_.href.ToLower().EndsWith("jpg") }
# Create a list of all href items from the table & call it $images
$images = $table.href
# Enumerate all of the images - for troubleshooting purposes - can be removed
$images
# Check to make sure there are images available for download - arbitrarily picked more than 2 $images
if($images.count -gt 2){
    # Delete all of the files in the "latest" folder
    Remove-Item ($path + "*.*") -Force
    # For loop to check to see if we already have the image and, if not, download it
    ForEach ($image in $images)
    {
        if(![System.IO.File]::Exists($path + $image)){
            Write-Output "Downloading: " $image
            Start-BitsTransfer -Source ($url + $image) -Destination $path -TransferType Download -RetryInterval 60
            Start-Sleep 2
        }
    }
    Get-BitsTransfer | Where-Object {$_.JobState -eq "Transferred"} | Complete-BitsTransfer
} else {
    Write-Output "No images to download"
}
I don't see any error handling in your code to resume/retry/restart on fail.
Meaning, why is there no try/catch in the loop or around the Get?
If the Get is meant to run once per download job in the loop, why is it outside the loop?
Download is the default for TransferType, so there is no need to specify it; it normally will generate an error if you do.
So, something like this. I did test this, but never got a failure; then again, I have a very high-speed internet connection. If you are doing this inside an enterprise, edge devices (filters, proxies) could also be slowing things down, potentially forcing timeouts.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url

# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
    $_.tagName -eq 'A' -and
    $_.href.ToLower().EndsWith('jpg')
}

<#
Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)

<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if($images.count -gt 2)
{
    Remove-Item ($path + '*.*') -Force

    ForEach ($image in $images)
    {
        Try
        {
            Write-Verbose -Message "Downloading: $image" -Verbose

            if(![System.IO.File]::Exists($path + $image))
            {
                $StartBitsTransferSplat = @{
                    Source        = ($url + $image)
                    Destination   = $path
                    RetryInterval = 60
                }
                Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
                Start-Sleep 2
            }

            Get-BitsTransfer |
            Where-Object {$PSItem.JobState -eq 'Transferred'} |
            Complete-BitsTransfer
        }
        Catch
        {
            $PSItem.Exception.Message
            Write-Warning -Message "Download of $image not complete or failed. Attempting a resume/retry" -Verbose
            Get-BitsTransfer -Name $image | Resume-BitsTransfer
        }
    }
}
else
{
    Write-Warning -Message 'No images to download'
}
See the help files:
Resume-BitsTransfer
Module: BitsTransfer. Resumes a BITS transfer job.
# Example 1: Resume all BITS transfer jobs owned by the current user
Get-BitsTransfer | Resume-BitsTransfer
# Example 2: Resume a new BITS transfer job that was initially suspended
$Bits = Start-BitsTransfer -DisplayName "MyJob" -Suspended
Add-BitsTransfer -BitsJob $Bits -ClientFileName C:\myFile -ServerFileName http://www.SomeSiteName.com/file1
Resume-BitsTransfer -BitsJob $Bits -Asynchronous
# Example 3: Resume the BITS transfer by the specified display name
Get-BitsTransfer -Name "TestJob01" | Resume-BitsTransfer
Here's a somewhat modified version of the above code. It appears the BITS transfer job object goes away when the error occurs, so there is no use in trying to find/resume that job. Instead, I wrapped the entire Try-Catch block in a while loop with an exit when the file is downloaded.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url
$MaxRetries = 3 # Initialize the maximum number of retry attempts.

# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object {
    $_.tagName -eq 'A' -and
    $_.href.ToLower().EndsWith('jpg')
}

<#
Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)

<#
Check to make sure there are images available for download - arbitrarily
picked more than 2 $images
#>
if ($images.count -gt 2) {
    Remove-Item ($path + '*.*') -Force

    ForEach ($image in $images) {
        # Due to occasional failures to transfer, wrap the BITS transfer in a while loop
        # re-initialize the exit counter for each new image
        $retryCount = 0
        while ($retryCount -le $MaxRetries) {
            Try {
                Write-Verbose -Message "Downloading: $image" -Verbose
                if (![System.IO.File]::Exists($path + $image)) {
                    $StartBitsTransferSplat = @{
                        Source        = ($url + $image)
                        Destination   = $path
                        RetryInterval = 60
                    }
                    Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
                    Start-Sleep 2
                }
                # To get here, the transfer must have finished, so set the counter
                # greater than the max value to exit the loop
                $retryCount = $MaxRetries + 1
            } # End Try block
            Catch {
                $PSItem.Exception.Message
                $retryCount += 1
                Write-Warning -Message "Download of $image not complete or failed. Attempting retry #: $retryCount" -Verbose
            } # End Catch block
        } # End While loop for retries
    } # End of loop over images
} # End of test for new images
else {
    Write-Warning -Message 'No images to download'
} # End of result for no new images
Here is a combination of the code that postanote provided and a Do-While loop to retry the download up to 5x if an error is thrown.
$url = 'https://www.nrlmry.navy.mil/archdat/global/stitched/MoS_2/navgem/wind_waves/latest/'
$path = 'D:\Temp\images\Wind_Waves\latest'
$site = Invoke-WebRequest -UseBasicParsing -Uri $url

# Create a table subset from the $site of all files returned with a .jpg extension
$table = $site.Links |
Where-Object{
    $_.tagName -eq 'A' -and
    $_.href.ToLower().EndsWith('jpg')
}

<#
Create a list of all href items from the table & call it $images
Enumerate all of the images - for troubleshooting purposes - can be removed
Assign and display using variable squeezing
#>
($images = $table.href)

<# Check to make sure there are images available for download - arbitrarily
picked more than 2 $images #>
if($images.count -gt 2)
{
    Remove-Item ($path + '*.*') -Force

    ForEach ($image in $images)
    {
        # Create a Do-While loop to retry downloads up to 5 times if they fail
        $Stoploop = $false
        [int]$Retrycount = 0

        do{
            Try
            {
                Write-Verbose -Message "Downloading: $image" -Verbose
                if(![System.IO.File]::Exists($path + $image))
                {
                    $StartBitsTransferSplat = @{
                        Source        = ($url + $image)
                        Destination   = $path
                        RetryInterval = 60
                    }
                    Start-BitsTransfer @StartBitsTransferSplat -ErrorAction Stop
                    Start-Sleep 10
                }

                Get-BitsTransfer |
                Where-Object {$PSItem.JobState -eq 'Transferred'} |
                Complete-BitsTransfer

                # Reaching this point without an exception means the image is present,
                # so stop looping (also covers the file-already-exists case)
                $Stoploop = $true
            }
            Catch
            {
                if ($Retrycount -gt 5){
                    $PSItem.Exception.Message
                    Write-Warning -Message "Download of $image not complete or failed." -Verbose
                    $Stoploop = $true
                }
                else {
                    Write-Host "Could not download the image, retrying..."
                    Start-Sleep 10
                    $Retrycount = $Retrycount + 1
                }
            }
        }
        While ($Stoploop -eq $false)
    }
}
else
{
    Write-Warning -Message 'No images to download'
}
I have this PowerShell script that I'm working on. A CSV file is imported to get source and destination paths. The goal is to move files from an SFTP/FTP server into a destination and send an email report.
Task Scheduler will run this code every hour, and if there's a new file, an email will be sent out.
It's almost done, but two things are missing:
Checking whether the file already exists, and the email body coming out empty. I'm getting the following error: Cannot validate argument on parameter 'Body'. The argument is null or empty. Provide an argument that is not null or empty, and then try the command again.
I would like some assistance on how to check if the file exists, and how to get this email sent when a new file is dropped and copied to the destination list.
$SMTPBody = ""
$SMTPMessage = #{
"SMTPServer" = ""
"From" = ""
"To" = ""
"Subject" = "New File"
}
try {
# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property #{
Protocol = [WinSCP.Protocol]::sftp
HostName = ""
UserName = ""
Password = ""
PortNumber = "22"
FTPMode = ""
GiveUpSecurityAndAcceptAnySshHostKey = $true
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Download files
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
Import-Csv -Path "D:\FILESOURCE.csv" -ErrorAction Stop | foreach {
$synchronizationResult = $session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Local, $_.Destination, $_.Source, $False)
$synchronizationResult.Check()
foreach ($download in $synchronizationResult.Downloads ) {
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
$SMTPBody +=
"`n Files: $($download.FileName -join ', ') `n" +
"Current Location: $($_.Destination)`n"
Send-MailMessage #SMTPMessage -Body $SMTPBody
}
$transferResult =
$session.GetFiles($_.Source, $_.Destination, $False, $transferOptions)
#Find the latest downloaded file
$latestTransfer =
$transferResult.Transfers |
Sort-Object -Property #{ Expression = { (Get-Item $_.Destination).LastWriteTime }
} -Descending |Select-Object -First 1
}
if ($latestTransfer -eq $Null) {
Write-Host "No files found."
$SMTPBody += "There are no new files at the moment"
}
else
{
$lastTimestamp = (Get-Item $latestTransfer.Destination).LastWriteTime
Write-Host (
"Downloaded $($transferResult.Transfers.Count) files, " +
"latest being $($latestTransfer.FileName) with timestamp $lastTimestamp.")
$SMTPBody += "file : $($latestTransfer)"
}
Write-Host "Waiting..."
Start-Sleep -Seconds 5
}
finally
{
Send-MailMessage #SMTPMessage -Body $SMTPBody
# Disconnect, clean up
$session.Dispose()
}
}
catch
{
Write-Host "Error: $($_.Exception.Message)"
}
I believe your code has more problems than you think.
Your combination of SynchronizeDirectories and GetFiles is suspicious: you first download only the new files with SynchronizeDirectories, and then you download all files again with GetFiles. I do not think you want that.
On any error the .Check call will throw, and you will not collect the error into your report.
You keep sending partial reports via Send-MailMessage in the foreach loop.
This is my take on your problem, hoping I've understood correctly what you want to implement:
$SMTPBody = ""
Import-Csv -Path "FILESOURCE.csv" -ErrorAction Stop | foreach {
Write-Host "$($_.Source) => $($_.Destination)"
$SMTPBody += "$($_.Source) => $($_.Destination)`n"
$synchronizationResult =
$session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Local, $_.Destination, $_.Source, $False)
$downloaded = #()
$failed = #()
$latestName = $Null
$latest = $Null
foreach ($download in $synchronizationResult.Downloads)
{
if ($download.Error -eq $Null)
{
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
$downloaded += $download.FileName
$ts = (Get-Item $download.Destination).LastWriteTime
if ($ts -gt $latest)
{
$latestName = $download.FileName;
$latest = $ts
}
}
else
{
Write-Host "File $($download.FileName) download failed" -ForegroundColor Red
$failed += $download.FileName
}
}
if ($downloaded.Count -eq 0)
{
$SMTPBody += "No new files were downloaded`n"
}
else
{
$SMTPBody +=
"Downloaded $($downloaded.Count) files:`n" +
($downloaded -join ", ") + "`n" +
"latest being $($latestName) with timestamp $latest.`n"
}
if ($failed.Count -gt 0)
{
$SMTPBody +=
"Failed to download $($failed.Count) files:`n" +
($failed -join ", ") + "`n"
}
$SMTPBody += "`n"
}
It will give you a report like:
/source1 => C:\dest1
Downloaded 3 files:
/source1/aaa.txt, /source1/bbb.txt, /source1/ccc.txt
latest being /source1/ccc.txt with timestamp 01/29/2020 07:49:07.
/source2 => C:\dest2
Downloaded 1 files:
/source2/aaa.txt
latest being /source2/aaa.txt with timestamp 01/29/2020 07:22:37.
Failed to download 1 files:
/source2/bbb.txt
To check and make sure the CSV file exists before you process the entire thing, you can use Test-Path:
...
if (!(Test-Path D:\FileSource.csv)) {
    Write-Output "No File Found"
    $SMTPBody += "There are no new files at the moment"
    return # Don't run; file not found. Exit out.
}
Import-Csv -Path "D:\FILESOURCE.csv" -ErrorAction Stop | foreach {
...
and for the Body error you are getting, it is coming from the finally block, because there are cases where $SMTPBody would be null. This will no longer be an issue, because $SMTPBody will have some text when the file is not found at the beginning.
Even though you are using return in the if statement that checks whether the file exists, finally will always get executed. Since we updated $SMTPBody, your Send-MailMessage will no longer error out.
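A quick way to convince yourself of that (a minimal illustration, not part of the original script):

function Test-Finally {
    try { return 'from try' } # return hands control back...
    finally { Write-Host 'finally still runs' } # ...but finally executes first
}
Test-Finally # prints 'finally still runs', then outputs 'from try'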
Update
If you want to check whether the file you are downloading already exists, you can use an if statement like this:
foreach ($download in $synchronizationResult.Downloads ) {
    if (Test-Path (Join-Path 'D:' $download.FileName)) {
        $SMTPBody += "File $($download.Filename) already exists, skipping."
        continue # will go to the next download...
    }
    Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
...
If you do get the error regarding the body, that's mostly because your script came across an exception and went straight to the finally statement. The finally statement then sends the email with an empty body, because the body was never set (due to the exception). I would recommend using the debugger (stepping through) to see which step causes the exception, and adding steps to make sure the script doesn't fail.
I've written a script to recursively loop through every directory, find Word files, and append "CONFIDENTIAL" to the footer. This worked fine until it came across an encrypted file, which caused the script to hang, and when I clicked cancel on the password prompt, the script crashed. I've attempted to
check if the document is encrypted before opening it, but the prompt still opens, which crashes the script. Is there a reliable way to check whether a document is password protected that works on both .doc and .docx files? I've already tried the code in the other thread: the first two methods don't work, and the third method detects every file as encrypted because it throws an exception.
Current code:
$word = New-Object -ComObject Word.Application
$word.Visible = $false

$files = Get-ChildItem -Recurse C:\temp\FooterDocuments -include *.docx,*.doc
$restricted = "CONFIDENTIAL"

foreach ($file in $files) {
    $filename = $file.FullName
    Write-Host $filename
    try {
        # "" as the PasswordDocument argument is meant to make Word throw instead of prompting
        $document = $word.Documents.Open($filename, $null, $null, $null, "")
        if($document.ProtectionType -ne -1) {
            $document.Close()
            Write-Host "$filename is encrypted"
            continue
        }
    } catch {
        Write-Host "$filename is encrypted"
        continue
    }
    foreach ($section in $document.Sections) {
        $footer = $section.Footers.Item(1)
        $footer.Range.Characters.Last.InsertAfter("`n" + $restricted)
    }
    $document.Save()
    $document.Close()
}
$word.Quit()
You can just use Get-Content. An encrypted .docx is stored in a compound-file wrapper whose encryption info contains the password keyEncryptor namespace URI, so a plain-text match on that string is enough to flag it.
$filelist = dir c:\tmp\*.docx
foreach ($file in $filelist) {
    [pscustomobject]@{
        File        = $file.FullName
        HasPassword = [bool]((get-content $file.FullName) -match "http://schemas.microsoft.com/office/2006/keyEncryptor/password")
    }
}
sample output:
File HasPassword
---- -----------
C:\tmp\New Microsoft Word Document (2).docx False
C:\tmp\New Microsoft Word Document.docx True
I am using this simple function to download a file:
function DownloadFile([string]$url, [string]$file)
{
    $clnt = new-object System.Net.WebClient
    Write-Host "Downloading from $url to $file"
    $clnt.DownloadFile($url, $file)
}
It works fine, but the script that calls it can run many times, and at present that can mean downloading the file(s) many times.
How can I modify the function to only download if the file doesn't exist locally, or if the server version is newer (e.g. the last-modified date on the server is greater than the last-modified date locally)?
EDIT:
This is what I've got so far. It seems to work, but I would like not to have two calls to the server.
function DownloadFile([string]$url, [string]$file)
{
    $downloadRequired = $true
    if ((test-path $file))
    {
        $localModified = (Get-Item $file).LastWriteTime

        # HEAD request: fetch only the headers to read Last-Modified
        $webRequest = [System.Net.HttpWebRequest]::Create($url);
        $webRequest.Method = "HEAD";
        $webResponse = $webRequest.GetResponse()
        $remoteLastModified = ($webResponse.LastModified) -as [DateTime]
        $webResponse.Close()

        if ($remoteLastModified -gt $localModified)
        {
            Write-Host "$file is out of date"
        }
        else
        {
            $downloadRequired = $false
        }
    }
    if ($downloadRequired)
    {
        $clnt = new-object System.Net.WebClient
        Write-Host "Downloading from $url to $file"
        $clnt.DownloadFile($url, $file)
    }
    else
    {
        Write-Host "$file is up to date."
    }
}
I've been beating on this all week, and came up with this:
# ----------------------------------------------------------------------------------------------
# download a file
# ----------------------------------------------------------------------------------------------
Function Download-File {
    Param (
        [Parameter(Mandatory=$True)] [System.Uri]$uri,
        [Parameter(Mandatory=$True)] [string]$FilePath
    )

    # Make sure the destination directory exists
    # System.IO.FileInfo works even if the file/dir doesn't exist, which is better than Get-Item, which requires the file to exist
    If (! ( Test-Path ([System.IO.FileInfo]$FilePath).DirectoryName ) ) { [void](New-Item ([System.IO.FileInfo]$FilePath).DirectoryName -force -type directory) }

    # See if this file exists
    if ( -not (Test-Path $FilePath) ) {
        # Use simple download
        [void] (New-Object System.Net.WebClient).DownloadFile($uri.ToString(), $FilePath)
    } else {
        try {
            # Use HttpWebRequest to download file
            $webRequest = [System.Net.HttpWebRequest]::Create($uri);
            $webRequest.IfModifiedSince = ([System.IO.FileInfo]$FilePath).LastWriteTime
            $webRequest.Method = "GET";
            [System.Net.HttpWebResponse]$webResponse = $webRequest.GetResponse()

            # Read HTTP result from the $webResponse
            $stream = New-Object System.IO.StreamReader($webResponse.GetResponseStream())
            # Save to file
            $stream.ReadToEnd() | Set-Content -Path $FilePath -Force
        } catch [System.Net.WebException] {
            # Check for a 304
            if ($_.Exception.Response.StatusCode -eq [System.Net.HttpStatusCode]::NotModified) {
                Write-Host "  $FilePath not modified, not downloading..."
            } else {
                # Unexpected error
                $Status = $_.Exception.Response.StatusCode
                $msg = $_.Exception
                Write-Host "  Error downloading $FilePath, Status code: $Status - $msg"
            }
        }
    }
}
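Usage would then look something like this (the URL and path here are placeholders, not from the original post):

Download-File -uri 'https://example.com/file.txt' -FilePath 'C:\temp\file.txt'

One caveat with this approach: StreamReader.ReadToEnd() plus Set-Content treats the payload as text, so for binary files you would want to copy the response stream to a FileStream instead.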
Last modified is in the HTTP response headers.
Try this:
$clnt.OpenRead($Url).Close();
$UrlLastModified = $clnt.ResponseHeaders["Last-Modified"];
If that's newer than the date on your file, your file is old.
The remote server doesn't have to respond with an accurate date or with the file's actual last modified date, but many will.
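Putting that together, a sketch (assuming $Url and $file hold the same values the DownloadFile function receives):

$clnt = new-object System.Net.WebClient
$clnt.OpenRead($Url).Close()
$remoteModified = [DateTime]$clnt.ResponseHeaders["Last-Modified"]
if ($remoteModified -gt (Get-Item $file).LastWriteTime) {
    $clnt.DownloadFile($Url, $file) # local copy is older, fetch the new version
}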
GetWebResponse() might be a better way to do this (or a more correct way). Using OpenRead() and then Close() immediately afterwards bothers my sensibilities, but I may be crazy. I mostly work on databases.
# If the local directory exists and it gets a response from the url,
# it checks the last modified date of the remote file. If the file
# already exists it compares the date of the file to the file from
# the url. If either the file doesn't exists or has a newer date, it
# downloads the file and modifies the file's date to match.
function download( $url, $dir, $file ) {
    if( Test-Path $dir -Ea 0 ) {
        $web = try { [System.Net.WebRequest]::Create("$url/$file").GetResponse() } catch [Net.WebException] {}
        if( $web.LastModified ) {
            $download = 0
            if(-Not(Test-Path "$dir\$file" -Ea 0)) { $download = 1 }
            elseif((gi "$dir\$file").LastWriteTime -ne $web.LastModified) { $download = 1 }
            if( $download ) {
                Invoke-WebRequest "$url/$file" -OutFile "$dir\$file" | Wait-Process
                (gi "$dir\$file").LastWriteTime = $web.LastModified
            }
            $web.Close()
        }
    }
}
download "https://website.com" "$env:systemdrive" "file.txt"