Renaming .msg files using Powershell - powershell

We tend to drag and drop messages from Outlook into Windows Explorer, so I need to rename the default message filename so that the files are searchable/readable from Explorer.
I have managed to put together the following code, which almost renames an Outlook file on a network folder from the default "Subject.msg" to "To - Subject - ReceivedDate - hhmmss.msg". The only problem is that the rename step does not work, as I believe the Outlook process is locking the file. I would appreciate help avoiding the lock so the files can be renamed. Also, I am not sure what happens if there are multiple people in the To list; I would be happy to take the first name in the To list. Here is my effort:
$olMailItemPath = "W:\Marketing\Prospects\Emails\*"
Write-Host $olMailItemPath
$SourceFiles = Get-Item -path $olMailItemPath -include *.msg
$outlook = New-Object -comobject outlook.application
$namespace = $outlook.GetNamespace("MAPI")
function cleanName($aname)
{
    # strip characters that are awkward in file names
    $aname = $aname -replace "'"
    $aname = $aname -replace ":"
    $aname = $aname -replace "#"
    $aname = $aname -replace "-"
    return ($aname.trim())
}
function cleanSubject($subject)
{
    # drop the reply prefix and pad with separators
    $subject = $subject -replace 'Re:'
    return (' - ' + $subject.trim() + ' - ')
}
foreach ($msg in $SourceFiles){
    $olMailItem = $NameSpace.OpenSharedItem($msg)
    $EmailTo = $olMailItem.To
    $EmailSubject = $olMailItem.Subject
    $DateReceived = $olMailItem.ReceivedTime
    $newfilename = (cleanName($EmailTo)) + (cleanSubject($EmailSubject)) + $DateReceived.ToString("yyyyMMdd - hhmmss") + ".msg"
    # Write-Host "Email Sent To: $EmailTo "
    # Write-Host "Subject: $EmailSubject "
    # Write-Host "Date Received: $DateReceived"
    Write-Host $msg
    Write-Host $newfilename
    Rename-Item $msg $newfilename
}
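On the multiple-recipient question: the MailItem.To property returns the display names as one semicolon-separated string, so taking the first name is a plain split. A minimal sketch (firstRecipient is a hypothetical helper, not an Outlook API):

```powershell
# Sketch only: Outlook's MailItem.To returns display names as a single
# semicolon-separated string, e.g. "Alice Smith; Bob Jones".
# firstRecipient is a hypothetical helper name.
function firstRecipient([string]$toLine)
{
    # Split on the separator and keep the first entry, trimmed.
    return ($toLine -split ';')[0].Trim()
}

firstRecipient "Alice Smith; Bob Jones"   # -> Alice Smith
firstRecipient "Alice Smith"              # single recipient passes through unchanged
```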
p.s. [Inserted @ 18 Jun 2013] In answer to Athom, I know Outlook is locking the file because I get the following error:
Rename-Item : The process cannot access the file because it is being used by another process.
At C:\Users\Bertie\Dropbox\Programming\Powershell\Rename Outlook Messages.ps1:41 char:16
+ Rename-Item <<<< -path $msg -newname $newFileName
+ CategoryInfo : WriteError: (W:\Marketing\Prospects\Emails\new.msg:String) [Rename-Item], IOException
+ FullyQualifiedErrorId : RenameItemIOError,Microsoft.PowerShell.Commands.RenameItemCommand
However, when I close Outlook (which is started by the PowerShell script), I can then run the Rename-Item command and it runs successfully.

How's this?
Essentially the changes I have made are:
Your renaming loop now throws its output into a hashtable.
Stop-Process kills Outlook.
Another loop then does the renaming.
# Declare the hashtable to store the names, then generate the names.
$nameHash = @{}
Foreach ($msg in $SourceFiles){
    # Do the Outlook thing
    $olMailItem = $NameSpace.OpenSharedItem($msg)
    $EmailTo = $olMailItem.To
    $EmailSubject = $olMailItem.Subject
    $DateReceived = $olMailItem.ReceivedTime
    $newfilename = (cleanName($EmailTo)) + (cleanSubject($EmailSubject)) + $DateReceived.ToString("yyyyMMdd - hhmmss") + ".msg"
    # Store the names
    $nameHash.Add("$msg","$newfilename")
}
# Kill Outlook.. Then wait....
Stop-Process -Name Outlook -Force
Start-Sleep -m 500 # You might be able to remove this - depends how beefy your CPU is.
# Rename
ForEach ($item in $nameHash.GetEnumerator()) {
    # Testing >>-->
    echo $item.Name
    echo $item.Value
    # <--<< Testing
    Rename-Item $item.Name $item.Value
}
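The two-phase pattern above (collect first, rename later) can be exercised without Outlook at all. Here is a self-contained sketch against a throwaway temp folder, with the Stop-Process step marked where it would go:

```powershell
# Self-contained demo of the two-phase rename: phase 1 only records the
# old -> new names; phase 2 applies them after the file locks are gone.
# Uses a throwaway temp folder instead of the real .msg files.
$dir = Join-Path ([IO.Path]::GetTempPath()) ("renametest_" + [guid]::NewGuid())
New-Item -ItemType Directory -Path $dir | Out-Null
New-Item -ItemType File -Path (Join-Path $dir "old.msg") | Out-Null

# Phase 1: build the name map (no renaming yet, so open handles don't matter).
$nameHash = @{}
foreach ($f in Get-ChildItem $dir -Filter *.msg) {
    $nameHash.Add($f.FullName, "new.msg")
}

# (In the real script this is where Stop-Process -Name Outlook -Force goes.)

# Phase 2: nothing holds the files any more, so the renames succeed.
foreach ($item in $nameHash.GetEnumerator()) {
    Rename-Item -Path $item.Name -NewName $item.Value
}
```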

Related

Service name missing in task, present in powershell

I have worked on a powershell script to check if certain important services are running on the server and email me the results. When I test the script and run it in powershell, the emails come through correctly. However when I set up the task on the server and have it run the script, the emails that arrive are now all missing the service names.
Email when the script is run in powershell:
At 12-11-20 11:06, the [service] is ALL GOOD on [server].
Email when the script is run through tasks:
At 12-11-20 11:00, the is on [server].
(services and server names redacted)
Any ideas on what I'm missing, such that it works in testing but not in the scheduled task?
My script code (with some details redacted):
$DateTime = Get-Date -Format "MM-dd-yy hh:mm"
# Check if service is running
$servName = "[service I need to check]"
$serv = Get-Service | Where-Object { $_.DisplayName -eq $servName }
$Subject = $Serv.Name + " check on [server]"
If ($serv.Status -ne "Running") {
    $Body = "At $DateTime, the " + $serv.DisplayName + " service is " + $serv.Status + " on [server]."
}
Else { $Body = "At $DateTime, the " + $serv.DisplayName + " is ALL GOOD on [server]." }
[location]\EmailAlert.ps1 -Recipient me@work.com -Subject $Subject -Body $Body
$servName = "[another service to check]"
$serv = Get-Service | Where-Object { $_.DisplayName -eq $servName }
$Subject = $Serv.Name + " check on [server]"
If ($serv.Status -ne "Running") {
    $Body = "At $DateTime, the " + $serv.DisplayName + " service is " + $serv.Status + " on [server]."
}
Else { $Body = "At $DateTime, the " + $serv.DisplayName + " is ALL GOOD on [server]." }
[location]\EmailAlert.ps1 -Recipient me@work.com -Subject $Subject -Body $Body
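One hedged guess at the failure mode: under the scheduled task's account, Get-Service finds no match for that DisplayName, so $serv is $null and every $serv.* property expands to an empty string, which matches the gutted emails exactly. A null guard makes that visible. makeBody is a hypothetical helper, not part of the original script:

```powershell
# Hypothetical helper (not in the original script) that makes a failed
# service lookup visible instead of silently emailing blanks.
function makeBody($serv, $DateTime)
{
    if ($null -eq $serv) {
        # Under the task's account the lookup can come back empty.
        return "At $DateTime, the service lookup FAILED on [server]."
    }
    if ($serv.Status -ne "Running") {
        return "At $DateTime, the " + $serv.DisplayName + " service is " + $serv.Status + " on [server]."
    }
    return "At $DateTime, the " + $serv.DisplayName + " is ALL GOOD on [server]."
}

# Simulated object standing in for Get-Service output:
$fake = [pscustomobject]@{ DisplayName = "Print Spooler"; Status = "Running" }
makeBody $fake "12-11-20 11:06"   # -> "At 12-11-20 11:06, the Print Spooler is ALL GOOD on [server]."
makeBody $null "12-11-20 11:06"   # -> "At 12-11-20 11:06, the service lookup FAILED on [server]."
```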

Exchange Management Shell - Exchange 2010 - Exporting with DateRange to PST ignoring the filtering

I'll elaborate on the title a bit.
I'm currently creating a script to export a mailbox on Exchange 2010 to a PST, with only the emails from a specific date range.
However, it seems to be ignoring the filter and exporting all 37 GB to a PST.
I'm creating the script to avoid having to do this by hand in the future. I will post the script below, as it is all relevant to the issue due to the variables etc.
# / Sets to US Date Values \ #
[System.Reflection.Assembly]::LoadWithPartialName("System.Threading")
[System.Reflection.Assembly]::LoadWithPartialName("System.Globalization")
[System.Threading.Thread]::CurrentThread.CurrentCulture = [System.Globalization.CultureInfo]::CreateSpecificCulture("en-us")
# / This Loads The Assemblies Required for Data Input for the future Parameters \ #
[System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("System.Drawing") | Out-Null
[System.Windows.Forms.Application]::EnableVisualStyles() | Out-Null
# / This Loads The Pop-up Data Input Windows For Creating the Parameters \ #
Add-Type -AssemblyName Microsoft.VisualBasic
# / Parameter Creation Using Nice GUI Pop-Up Windows \ #
$User = [Microsoft.VisualBasic.Interaction]::InputBox('What Is The Email Account?', 'Email Address', "Email@Email.com")
$StartDateData = [Microsoft.VisualBasic.Interaction]::InputBox('What Is The Start Date (US Date Format)', 'Start Date', "12/25/1900")
$EndDateData = [Microsoft.VisualBasic.Interaction]::InputBox('What Is The End Date (US Date Format)', 'End Date', "12/25/1900")
$Path = [Microsoft.VisualBasic.Interaction]::InputBox('Specify Where You Want The PST to Be Saved (Full UNC Path WITH Trailing Slash)', 'Path', "C:\Users\%USERPROFILE%\Desktop\")
# / Export Mailbox \ #
cls
Write-Host ''
Write-Host ''
Write-Host ''
Write-Host ''
Write-Host ''
Write-Host ''
Write-Host 'Is this Data Correct?'
Write-Host ''
Write-Host ''
Write-Host $User
Write-host $StartDateData
Write-host $EndDateData
write-host $Path
Write-Host ''
Write-Host ''
Write-Host ''
Write-Host "Do You Want To Continue? (Y/N)"
$response = Read-Host
if ( $response -ne "Y" ) {
exit
}
cls
# / Sets The Path Parameter \ #
$PSTPath = $Path + $User + ".pst"
# / Sets The Date Parameter \ #
$StartDate = "'" + $StartDateData + "'"
$EndDate = "'" + $EndDateData + "'"
# Use This if the Below Doesn't Work - Export-Mailbox -Identity $User -StartDate $StartDate -EndDate $EndDate -PstFolderPath $PSTPath
# gt = Greater-Than
# ge = Greater-Than-Or-Equal-To
# lt = Less-Than
# le = Less-Than-Or-Equal-To
$Request = New-MailboxExportRequest -Mailbox $User -ContentFilter {(Received -ge $StartDate) -and (Received -le $EndDate)} -FilePath $PSTPath
$Status = ( Get-MailboxExportRequestStatistics -Identity $Request ).Status.ToString().Trim()
while( $Status -ne 'Completed' ){
    Start-Sleep 10
    $Status = ( Get-MailboxExportRequestStatistics -Identity $Request ).Status.ToString().Trim()
    Write-Verbose "Current Export Status: $Status" -Verbose
}
Write-Verbose "$Mailbox exported" -Verbose
Apologies about the bulk; I can't personally see an error.
I ended up fixing this myself after A LOT of playing around.
I had to throw the filter into its own parameter, like so:
$filter = "(Received -ge" + " " + $StartDate + ") -and (Received -le" + " " + $EndDate + ")"
then pass it to the command:
$Request = New-MailboxExportRequest -ContentFilter $filter -Mailbox $User -Name $ReqName -FilePath $PSTPath
This has allowed it to progress.
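The likely reason the string form works: -ContentFilter expects an OPATH filter string, and inside a { } scriptblock the $StartDate/$EndDate variables are passed through unexpanded, so the server never sees the actual dates. Building a plain string bakes them in before the cmdlet runs. A small sketch with made-up dates:

```powershell
# Demonstrating the expansion only; the dates here are hypothetical.
$StartDate = "'01/01/2020'"
$EndDate   = "'06/30/2020'"

# The double-quoted pieces expand the variables before New-MailboxExportRequest
# ever sees the filter, so the server receives literal dates:
$filter = "(Received -ge" + " " + $StartDate + ") -and (Received -le" + " " + $EndDate + ")"
$filter   # -> (Received -ge '01/01/2020') -and (Received -le '06/30/2020')
```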

Powershell, System.Diagnostics.Process & exiftool stop working when dealing with hundreds of commands

I created a tool (to be precise: a Powershell script) that helps me with converting pictures in folders, i.e. it looks for all files of a certain ending (say, *.TIF) and converts them to JPEGs via ImageMagick. It then transfers some EXIF, IPTC and XMP information from the source image to the JPEG via exiftool:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
    [PSCustomObject]@{
        SourceFullName = $_.FullName
        JPEGFullName   = $_.FullName -Replace 'tif$','jpg'
    }
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# + + + + The problem occurs somewhere after this line + + + +
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -# -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
# creating the string argument for every file, then pass it over to exiftool:
for($i=0; $i -lt $WorkingFiles.length; $i++){
[string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
# using -overwrite_original makes no difference
# Also, just as good as above code:
# [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
$exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
# no difference using start-sleep:
# Start-Sleep -Milliseconds 25
}
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolproc.StandardOutput.ReadToEnd()
$outputout = $outputout -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
If you want to reproduce but do not have/want that many files, there is also a simpler way: let exiftool print out its version number 600 times:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -# -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
for($i=0; $i -lt 600; $i++){
try{
$exiftoolproc.StandardInput.WriteLine("-ver`n-execute`n")
Write-Output "Success:`t$i"
}catch{
Write-Output "Failed:`t$i"
}
}
# close exiftool:
try{
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
}catch{
Write-Output "Could not close exiftool!"
}
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[array]$outputout = @($exiftoolproc.StandardOutput.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
As far as I could test, it all goes well as long as you stay below 115 files. If you go above that, the 114th JPEG gets proper metadata, but exiftool stops working after that one: it idles, and my script does too. I can reproduce this with different files, paths, and exiftool commands.
Neither StandardOutput nor StandardError shows any irregularities, even with exiftool's -verbose flag; of course they would not, as I have to kill exiftool to get them to show up at all.
Running ISE's / VSCode's debugger shows nothing. Exiftool's window (only showing up when debugging) shows nothing.
Is there some hard limit on commands run with System.Diagnostics.Process, is this a problem with exiftool, or is this simply due to my incompetence with anything beyond the most basic PowerShell cmdlets? Or maybe the better question is: how can I properly debug this?
Powershell is 5.1, exiftool is 10.80 (production) - 10.94 (latest).
After messing around with different variants of $ArgList, I found out that there is no difference when using different file commands, but using commands that produce less StdOut (like -ver) resulted in more iterations. Therefore, I took an educated guess that the output buffer is the culprit.
As per Mark Byers' answer to "ProcessStartInfo hanging on “WaitForExit”? Why?":
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full. [...]
The solution is to use asynchronous reads to ensure that the buffer doesn't get full.
Then, it was just a matter of searching for the right things. I found that Alexander Obersht's answer to "How to capture process output asynchronously in powershell?" provides almost everything that I needed.
The script now looks like this:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
    [PSCustomObject]@{
        SourceFullName = $_.FullName
        JPEGFullName   = $_.FullName -Replace 'tif$','jpg'
    }
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -# -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
# + + + + NEW STUFF (1/2) HERE: + + + +
# Creating process object.
$exiftoolproc = New-Object -TypeName System.Diagnostics.Process
$exiftoolproc.StartInfo = $psi
# Creating string builders to store stdout and stderr.
$exiftoolStdOutBuilder = New-Object -TypeName System.Text.StringBuilder
$exiftoolStdErrBuilder = New-Object -TypeName System.Text.StringBuilder
# Adding event handlers for stdout and stderr.
$exiftoolScriptBlock = {
    if (-not [String]::IsNullOrEmpty($EventArgs.Data)){
        $Event.MessageData.AppendLine($EventArgs.Data)
    }
}
$exiftoolStdOutEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScriptBlock -EventName 'OutputDataReceived' -MessageData $exiftoolStdOutBuilder
$exiftoolStdErrEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScriptBlock -EventName 'ErrorDataReceived' -MessageData $exiftoolStdErrBuilder
[Void]$exiftoolproc.Start()
$exiftoolproc.BeginOutputReadLine()
$exiftoolproc.BeginErrorReadLine()
# + + + + END OF NEW STUFF (1/2) + + + +
# creating the string argument for every file, then pass it over to exiftool:
for($i=0; $i -lt $WorkingFiles.length; $i++){
[string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
# using -overwrite_original makes no difference
# Also, just as good as above code:
# [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
$exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
}
# + + + + NEW STUFF (2/2) HERE: + + + +
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
$exiftoolproc.WaitForExit()
# Unregistering events to retrieve process output.
Unregister-Event -SourceIdentifier $exiftoolStdOutEvent.Name
Unregister-Event -SourceIdentifier $exiftoolStdErrEvent.Name
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolStdErrBuilder.ToString().Trim().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolStdOutBuilder.ToString().Trim() -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
# + + + + END OF NEW STUFF (2/2) + + + +
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
I can confirm that it works for many, many files (at least 1600).

Shortcut to add files into local sql database

I would like to be able to right click on a file(s) and "send-to" a local MSSQL database. The details are that I would like to store the file contents in "contents" column and the file name in the "filename" column ... how novel :)
*In most cases the file contents are HTML.
It seems like it should be possible through windows shell/SQL Shell using a shortcut to a command in the "shell:sendto" folder.
[System.Reflection.Assembly]::LoadWithPartialName('Microsoft.SqlServer.SMO') | Out-Null
$Server1 = New-Object ("Microsoft.SqlServer.Management.Smo.Server") 'SQLSERVER'
$Server1.databases["DB"].tables["Table"].rowcount
$RowCount = $server1.databases["DB"].tables["Table"].rowcount.ToString()
$TotalRecords = [int]$RowCount
$wc = New-Object system.net.WebClient
$url = ""
$files = @(Get-ChildItem c:\test\*.*)
"Number of files $($files.length)"
# Errors out when no files are found
if($files.length -lt 1) { return }
foreach($file1 in $files) {
    # $txt = Get-Content($file1)
    # $txt = $txt.Replace("'", "''")
    # Write-Host $file1.name + " - - " + $Txt
    $url1 = $url + $file1
    Write-Host("URL is " + $url1)
    $webpage = $wc.DownloadData($url1)
    $string = [System.Text.Encoding]::ASCII.GetString($webpage)
    $string = $string.Replace("'", "''")
    Invoke-SqlCmd -ServerInstance SERVER -Query "Insert into DATABASE.dbo.Table(text,filename) Values ('$string','$file1')"
}
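A hedged simplification: since the files are local, the WebClient round-trip can be skipped by reading each file straight from disk, and doubling single quotes remains the minimal escaping for a T-SQL string literal (a parameterized SqlCommand would be the safer route against injection). Sketch against a temp file; the Invoke-SqlCmd line is left commented since it needs a live server:

```powershell
# Demo file standing in for one of the HTML files:
$tmp = Join-Path ([IO.Path]::GetTempPath()) "sendto_demo.html"
Set-Content -Path $tmp -Value "it's <b>html</b>"

# Read the whole file as one string and double the single quotes so it can
# sit inside a T-SQL '...' literal; Trim drops the trailing newline.
$string = (Get-Content -Path $tmp -Raw).Replace("'", "''").Trim()
$string   # -> it''s <b>html</b>

# With a live server this would then be:
# Invoke-SqlCmd -ServerInstance SERVER -Query "Insert into DATABASE.dbo.Table(text,filename) Values ('$string','$(Split-Path $tmp -Leaf)')"
```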

Powershell FTP Send large file System.OutOfMemoryException

I have a script partially based on the one here: Upload files with FTP using PowerShell
It all works absolutely fine with tiny files, but I am trying to use it to make the process we use for exporting Access .mdb files to clients that only have FTP more robust.
My first test involved a 10 MB file, and I ran into a System.OutOfMemoryException at the Get-Content stage.
The PowerShell ISE was running at nearly 2 GB of memory usage during the get attempt.
Here is a full script sample (Be gentle. I am fairly new to it):
#####
# User variables to control the script
#####
# How many times connection will be re-tried
$connectionTries = 5
#time between tries in seconds
$connectionTryInterval = 300
#Where to log the output
$logFile = "D:\MyPath\ftplog.txt"
#maximum log file size in KB before it is archived
$logFileMaxSize = 500
#formatted date part for the specific file to transfer
#This is appended to the filename base. Leave as "" for none
$datePart = ""
#base part of the file name
$fileNameBase = "Myfile"
#file extension
$fileExtension = ".mdb"
#location of the source file (please include trailing backslash)
$sourceLocation = "D:\MyPath\"
#location and credentials of the target ftp server
$userName = "iamafish"
$password = "ihavenofingers"
$ftpServer = "10.0.1.100"
######
# Main Script
#####
#If there is a log file and it is longer than the declared limit then archive it with the current timestamp
if (test-path $logfile)
{
if( $((get-item $logFile).Length/1kb) -gt $logFileMaxSize)
{
write-host $("archiving log to ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
rename-item $logFile $("ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
}
}
#start new log entry
#Add-Content $logFile "___________________________________________________________"
#write-host $logEntry
#construct source file and destination uri
$fileName = $fileNameBase + $datePart + $fileExtension
$sourceFile = $sourceLocation + $fileName
$sourceuri = "ftp://" + $ftpServer + "/" + $fileName
# Create a FTPWebRequest object to handle the connection to the ftp server
$ftprequest = [System.Net.FtpWebRequest]::create($sourceuri)
# set the request's network credentials for an authenticated connection
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftprequest.UseBinary = $true
$ftprequest.KeepAlive = $false
$succeeded = $true
$errorMessage = ""
# read in the file to upload as a byte array
trap [exception]{
$script:succeeded = $false
$script:errorMessage = $_.Exception.Message
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|" + $_.Exception.Message)
#write-host $logEntry
#write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
#write-host $("TRAPPED: " + $_.Exception.Message)
exit
}
#The -ea 1 forces the error to be trappable
$content = gc -en byte $sourceFile -ea 1
$try = 0
do{
trap [System.Net.WebException]{
$script:succeeded = $false
$script:errorMessage = $_.Exception.Message
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|" + $_.Exception.Message)
#write-host $logEntry
#write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
$script:try++
start-sleep -s $connectionTryInterval
continue
}
$ftpresponse = $ftprequest.GetResponse()
} while(($try -le $connectionTries) -and (-not $succeeded))
if ($succeeded) {
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|0|" + "Starting file transfer.")
# get the request stream, and write the bytes into it
$rs = $ftprequest.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
$content.Close()
$content.Dispose()
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|0|" + "Transfer complete.")
#write-host $logEntry
}
I can't put code in comments, so, thanks to pointers from Keith, I have moved the file-access bit down to the bottom to link it with the other trap, like so:
trap [Exception]{
$script:succeeded = $false
$script:errorMessage = $_.Exception.Message
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|Check File Connection|" + $_.Exception.Message)
$sourceStream.Close()
$sourceStream.Dispose()
#write-host $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|Attempt to open file|" + $_.Exception.Message)
#write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
exit
}
$sourceStream = New-Object IO.FileStream ($(New-Object System.IO.FileInfo $sourceFile),[IO.FileMode]::Open)
[byte[]]$readbuffer = New-Object byte[] 1024
# get the request stream, and write the bytes into it
$rs = $ftprequest.GetRequestStream()
do{
$readlength = $sourceStream.Read($readbuffer,0,1024)
$rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)
I just need to work out why I get: Exception calling "GetResponse" with "0" argument(s): "Cannot access a disposed object."
every other time I run it. Is this a quirk of running it in the ISE, or am I doing something drastically wrong with either the initial declaration or the final disposing?
I'll post the full final script when done since I think it will make a nice sturdy ftp export example with error trapping and logging.
OK, here is the full script. Dispose is edited out, but with or without it, running the script within 5 minutes will either get me a message that I cannot use a disposed object, or tell me that GetResponse() has produced an error (226) File transferred (running in the ISE). Whilst this will not be a problem during normal operation, I would like to correctly log out of the FTP session and clean up the resources at the end of the script, and ensure I am correctly declaring them as needed.
#####
# User variables to control the script
#####
# How many times connection will be re-tried
$connectionTries = 5
#time between tries in seconds
$connectionTryInterval = 1
#Where to log the output
$logFile = "D:\MyPath\ftplog.txt"
#maximum log file size in KB before it is archived
$logFileMaxSize = 500
#log to file or console - #true=log to file, #false = log to console
$logToFile=$false
#formatted date part for the specific file to transfer
#This is appended to the filename base. Leave as "" for none
$datePart = ""
#base part of the file name
$fileNameBase = "MyFile"
#file extension
$fileExtension = ".mdb"
#location of the source file (please include trailing backslash)
$sourceLocation = "D:\MyPath\"
#location and credentials of the target ftp server
$userName = "iamafish"
$password = "ihavenofingers"
$ftpServer = "10.0.1.100"
######
# Main Script
#####
function logEntry($entryType, $section, $message)
{
#just to make a one point switch for logging to console for testing
# $entryType: 0 = success, 1 = Error
# $section: The section of the script the log entry was generated from
# $message: the log message
#This is pipe separated to fit in with my standard MSSQL linked flat file schema for easy querying
$logString = "$(get-Date -format "yyyy-MM-dd hh:mm:ss")|$entryType|$section|$message"
if($script:logtoFile)
{
Add-Content $logFile $logString
}
else
{
write-host $logString
}
}
#If there is a log file and it is longer than the declared limit then archive it with the current timestamp
if (test-path $logfile)
{
if( $((get-item $logFile).Length/1kb) -gt $logFileMaxSize)
{
write-host $("archiving log to ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
rename-item $logFile $("ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
New-Item $logFile -type file
}
}
else
{
New-Item $logFile -type file
}
#construct source file and destination uri
$fileName = $fileNameBase + $datePart + $fileExtension
$sourceFile = $sourceLocation + $fileName
$destination = "ftp://" + $ftpServer + "/" + $fileName
#Check if the source file exists
if ((test-path $sourceFile) -eq $false)
{
logEntry 1 "Check Source File" $("File not found: " + $sourceFile)
Exit
}
# Create a FTPWebRequest object to handle the connection to the ftp server
$ftpRequest = [System.Net.FtpWebRequest]::create($destination)
# set the request's network credentials for an authenticated connection
$ftpRequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftpRequest.UseBinary = $true
$ftpRequest.KeepAlive = $false
$succeeded = $true
$try = 1
do{
trap [Exception]{
$script:succeeded = $false
logEntry 1 "Check FTP Connection" $_.Exception.Message
$script:try++
start-sleep -s $connectionTryInterval
continue
}
$ftpResponse = $ftpRequest.GetResponse()
} while(($try -le $connectionTries) -and (-not $succeeded))
if ($succeeded) {
logEntry 0 "Connection to FTP" "Success"
# Open a filestream to the source file
trap [Exception]{
logEntry 1 "Check File Connection" $_.Exception.Message
$sourceStream.Close()
$ftpResponse.Close()
exit
}
$sourceStream = New-Object IO.FileStream ($(New-Object System.IO.FileInfo $sourceFile),[IO.FileMode]::Open)
[byte[]]$readbuffer = New-Object byte[] 1024
logEntry 0 "Starting file transfer" "Success"
# get the request stream, and write the bytes into it
$rs = $ftpRequest.GetRequestStream()
do{
$readlength = $sourceStream.Read($readbuffer,0,1024)
$rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)
logEntry 0 "Transfer complete" "Success"
# be sure to clean up after ourselves
$rs.Close()
#$rs.Dispose()
$sourceStream.Close()
#$sourceStream.Dispose()
}
$ftpResponse.Close()
Example of trying to trap the Transfer OK response at the end:
logEntry 0 "Starting file transfer" "Success"
# get the request stream, and write the bytes into it
$rs = $ftpRequest.GetRequestStream()
do{
$readlength = $sourceStream.Read($readbuffer,0,1024)
$rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)
$rs.Close()
#start-sleep -s 2
trap [Exception]{
$script:succeeded = $false
logEntry 1 "Check FTP Connection" $_.Exception.Message
continue
}
$ftpResponse = $ftpRequest.GetResponse()
Having hit a similar issue myself, with RAM usage hitting the gigabytes while uploading a 3 MB file, I found that replacing:
$content = gc -en byte $sourceFile
with:
$content = [System.IO.File]::ReadAllBytes($sourceFile)
gives much better performance. As mentioned elsewhere, chunking would be a better solution for really large files, as then you're not holding the whole file in memory at once, but the code above at least only consumes ~(size of file) bytes of RAM, which means it should be good up to the tens-of-MB range.
Rather than read the whole file into memory using Get-Content, try reading it a chunk at a time and writing the chunks to the FTP request stream. I would use one of the lower-level .NET file stream APIs to do the reading. Admittedly, you wouldn't think a 10 MB file would pose a memory problem, though.
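The chunked approach can be sketched in isolation with in-memory streams, so the copy loop itself is easy to verify; in the real script the destination would be the FTP request stream from GetRequestStream():

```powershell
# Copy 5000 bytes through a 1 KB buffer; memory use stays at buffer size
# no matter how large the source is. $dest stands in for the FTP request stream.
$src  = New-Object IO.MemoryStream (,[Text.Encoding]::ASCII.GetBytes("x" * 5000))
$dest = New-Object IO.MemoryStream
[byte[]]$buffer = New-Object byte[] 1024
do {
    $readlength = $src.Read($buffer, 0, 1024)
    $dest.Write($buffer, 0, $readlength)
} while ($readlength -ne 0)
$dest.Length   # -> 5000
```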
Also, make sure you get the response after getting the request stream and writing to it. Getting the response is what actually uploads the data. From the docs:
When using an FtpWebRequest object to upload a file to a server, you must write the file content to the request stream obtained by calling the GetRequestStream method or its asynchronous counterparts, the BeginGetRequestStream and EndGetRequestStream methods. You must write to the stream and close the stream before sending the request. Requests are sent to the server by calling the GetResponse method or its asynchronous counterparts, the BeginGetResponse and EndGetResponse methods. When the requested operation completes, an FtpWebResponse object is returned. The FtpWebResponse object provides the status of the operation and any data downloaded from the server.