PowerShell FTP: sending a large file throws System.OutOfMemoryException

I have a script partially based on the one here: Upload files with FTP using PowerShell
It all works absolutely fine with tiny files, but I am trying to use it to make our process for exporting Access .mdb files to clients that only have FTP more robust.
My first test involved a 10 MB file and I ran into a System.OutOfMemoryException at the Get-Content stage.
The PowerShell ISE was using nearly 2 GB of memory during the read attempt.
Here is a full script sample (be gentle, I am fairly new to this):
#####
# User variables to control the script
#####
# How many times connection will be re-tried
$connectionTries = 5
#time between tries in seconds
$connectionTryInterval = 300
#Where to log the output
$logFile = "D:\MyPath\ftplog.txt"
#maximum log file size in KB before it is archived
$logFileMaxSize = 500
#formatted date part for the specific file to transfer
#This is appended to the filename base. Leave as "" for none
$datePart = ""
#base part of the file name
$fileNameBase = "Myfile"
#file extension
$fileExtension = ".mdb"
#location of the source file (please include trailing backslash)
$sourceLocation = "D:\MyPath\"
#location and credentials of the target ftp server
$userName = "iamafish"
$password = "ihavenofingers"
$ftpServer = "10.0.1.100"
######
# Main Script
#####
#If there is a log file and it is longer than the declared limit then archive it with the current timestamp
if (test-path $logfile)
{
if( $((get-item $logFile).Length/1kb) -gt $logFileMaxSize)
{
write-host $("archiving log to ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
rename-item $logFile $("ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
}
}
#start new log entry
#Add-Content $logFile "___________________________________________________________"
#write-host $logEntry
#construct source file and destination URI
$fileName = $fileNameBase + $datePart + $fileExtension
$sourceFile = $sourceLocation + $fileName
$sourceuri = "ftp://" + $ftpServer + "/" + $fileName
# Create a FTPWebRequest object to handle the connection to the ftp server
$ftprequest = [System.Net.FtpWebRequest]::create($sourceuri)
# set the request's network credentials for an authenticated connection
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftprequest.UseBinary = $true
$ftprequest.KeepAlive = $false
$succeeded = $true
$errorMessage = ""
# read in the file to upload as a byte array
trap [exception]{
$script:succeeded = $false
$script:errorMessage = $_.Exception.Message
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|" + $_.Exception.Message)
#write-host $logEntry
#write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
#write-host $("TRAPPED: " + $_.Exception.Message)
exit
}
#The -ea 1 forces the error to be trappable
$content = gc -en byte $sourceFile -ea 1
$try = 0
do{
trap [System.Net.WebException]{
$script:succeeded = $false
$script:errorMessage = $_.Exception.Message
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|" + $_.Exception.Message)
#write-host $logEntry
#write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
$script:try++
start-sleep -s $connectionTryInterval
continue
}
$ftpresponse = $ftprequest.GetResponse()
} while(($try -le $connectionTries) -and (-not $succeeded))
if ($succeeded) {
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|0|" + "Starting file transfer.")
# get the request stream, and write the bytes into it
$rs = $ftprequest.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
$content.Close()
$content.Dispose()
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|0|" + "Transfer complete.")
#write-host $logEntry
}
I can't put code in comments, so thanks to pointers from Keith I have moved the file access bit down to the bottom to link it with the other, like so:
trap [Exception]{
$script:succeeded = $false
$script:errorMessage = $_.Exception.Message
Add-Content $logFile $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|Check File Connection|" + $_.Exception.Message)
$sourceStream.Close()
$sourceStream.Dispose()
#write-host $((get-Date -format "yyyy-MM-dd hh:mm:ss") + "|1|Attempt to open file|" + $_.Exception.Message)
#write-host $("TRAPPED: " + $_.Exception.GetType().FullName)
exit
}
$sourceStream = New-Object IO.FileStream ($(New-Object System.IO.FileInfo $sourceFile),[IO.FileMode]::Open)
[byte[]]$readbuffer = New-Object byte[] 1024
# get the request stream, and write the bytes into it
$rs = $ftprequest.GetRequestStream()
do{
$readlength = $sourceStream.Read($readbuffer,0,1024)
$rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)
I just need to work out why I get: Exception calling "GetResponse" with "0" argument(s): "Cannot access a disposed object."
every other time I run it. Is this a quirk of running it in the ISE, or am I doing something drastically wrong with either the initial declaration or the final disposing?
I'll post the full final script when done since I think it will make a nice sturdy ftp export example with error trapping and logging.
OK, here is the full script. Dispose is edited out, but with or without it, running the script within 5 minutes will either get me a message that I cannot use a disposed object or tell me that GetResponse() has produced an error: (226) File transferred (running in the ISE). Whilst this will not be a problem during normal operation, I would like to correctly log out of the FTP session and clean up the resources at the end of the script, and ensure I am correctly declaring them as needed.
#####
# User variables to control the script
#####
# How many times connection will be re-tried
$connectionTries = 5
#time between tries in seconds
$connectionTryInterval = 1
#Where to log the output
$logFile = "D:\MyPath\ftplog.txt"
#maximum log file size in KB before it is archived
$logFileMaxSize = 500
#log to file or console - $true = log to file, $false = log to console
$logToFile=$false
#formatted date part for the specific file to transfer
#This is appended to the filename base. Leave as "" for none
$datePart = ""
#base part of the file name
$fileNameBase = "MyFile"
#file extension
$fileExtension = ".mdb"
#location of the source file (please include trailing backslash)
$sourceLocation = "D:\MyPath\"
#location and credentials of the target ftp server
$userName = "iamafish"
$password = "ihavenofingers"
$ftpServer = "10.0.1.100"
######
# Main Script
#####
function logEntry($entryType, $section, $message)
{
#just to make a one point switch for logging to console for testing
# $entryType: 0 = success, 1 = Error
# $section: The section of the script the log entry was generated from
# $message: the log message
#This is pipe separated to fit in with my standard MSSQL linked flat file schema for easy querying
$logString = "$(get-Date -format "yyyy-MM-dd hh:mm:ss")|$entryType|$section|$message"
if($script:logtoFile)
{
Add-Content $logFile $logString
}
else
{
write-host $logString
}
}
#If there is a log file and it is longer than the declared limit then archive it with the current timestamp
if (test-path $logfile)
{
if( $((get-item $logFile).Length/1kb) -gt $logFileMaxSize)
{
write-host $("archiving log to ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
rename-item $logFile $("ftplog_" + (get-date -format yyyyMMddhhmmss) + ".txt")
New-Item $logFile -type file
}
}
else
{
New-Item $logFile -type file
}
#construct source file and destination URI
$fileName = $fileNameBase + $datePart + $fileExtension
$sourceFile = $sourceLocation + $fileName
$destination = "ftp://" + $ftpServer + "/" + $fileName
#Check if the source file exists
if ((test-path $sourceFile) -eq $false)
{
logEntry 1 "Check Source File" $("File not found: " + $sourceFile)
Exit
}
# Create a FTPWebRequest object to handle the connection to the ftp server
$ftpRequest = [System.Net.FtpWebRequest]::create($destination)
# set the request's network credentials for an authenticated connection
$ftpRequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftpRequest.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftpRequest.UseBinary = $true
$ftpRequest.KeepAlive = $false
$succeeded = $true
$try = 1
do{
trap [Exception]{
$script:succeeded = $false
logEntry 1 "Check FTP Connection" $_.Exception.Message
$script:try++
start-sleep -s $connectionTryInterval
continue
}
$ftpResponse = $ftpRequest.GetResponse()
} while(($try -le $connectionTries) -and (-not $succeeded))
if ($succeeded) {
logEntry 0 "Connection to FTP" "Success"
# Open a filestream to the source file
trap [Exception]{
logEntry 1 "Check File Connection" $_.Exception.Message
$sourceStream.Close()
$ftpResponse.Close()
exit
}
$sourceStream = New-Object IO.FileStream ($(New-Object System.IO.FileInfo $sourceFile),[IO.FileMode]::Open)
[byte[]]$readbuffer = New-Object byte[] 1024
logEntry 0 "Starting file transfer" "Success"
# get the request stream, and write the bytes into it
$rs = $ftpRequest.GetRequestStream()
do{
$readlength = $sourceStream.Read($readbuffer,0,1024)
$rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)
logEntry 0 "Transfer complete" "Success"
# be sure to clean up after ourselves
$rs.Close()
#$rs.Dispose()
$sourceStream.Close()
#$sourceStream.Dispose()
}
$ftpResponse.Close()
Example of trying to trap the Transfer OK response at the end:
logEntry 0 "Starting file transfer" "Success"
# get the request stream, and write the bytes into it
$rs = $ftpRequest.GetRequestStream()
do{
$readlength = $sourceStream.Read($readbuffer,0,1024)
$rs.Write($readbuffer,0,$readlength)
} while ($readlength -ne 0)
$rs.Close()
#start-sleep -s 2
trap [Exception]{
$script:succeeded = $false
logEntry 1 "Check FTP Connection" $_.Exception.Message
continue
}
$ftpResponse = $ftpRequest.GetResponse()

Having hit a similar issue myself, with RAM usage hitting the GBs while uploading a 3 MB file, I found that replacing:
$content = gc -en byte $sourceFile
With:
$content = [System.IO.File]::ReadAllBytes($sourceFile)
Gives much better performance. As mentioned elsewhere, chunking would be a better solution for really large files, as then you're not holding the whole file in memory at once, but the code above at least only consumes roughly (size of file) bytes of RAM, which means it should be good up into the tens-of-MB range.
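For really large files, a chunked upload keeps memory use roughly constant. A minimal sketch, assuming $sourceFile and a configured $ftprequest as in the question (the buffer size is arbitrary):
# Chunked-upload sketch: stream the file through a small reusable buffer
$sourceStream = [System.IO.File]::OpenRead($sourceFile)
$requestStream = $ftprequest.GetRequestStream()
try {
    $buffer = New-Object byte[] 8192
    while (($read = $sourceStream.Read($buffer, 0, $buffer.Length)) -gt 0) {
        $requestStream.Write($buffer, 0, $read)
    }
}
finally {
    # close both streams even if the copy fails part-way
    $requestStream.Close()
    $sourceStream.Close()
}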

Rather than read the whole file into memory using Get-Content, try reading it in a chunk at a time and writing it to the FTP request stream. I would use one of the lower-level .NET file stream APIs to do the reading. Admittedly, you wouldn't think a 10 MB file would pose a memory problem though.
Also, make sure you get the response after getting the request stream and writing to it. Getting the response is what uploads the data. From the docs:
When using an FtpWebRequest object to upload a file to a server, you must write the file content to the request stream obtained by calling the GetRequestStream method or its asynchronous counterparts, the BeginGetRequestStream and EndGetRequestStream methods. You must write to the stream and close the stream before sending the request.
Requests are sent to the server by calling the GetResponse method or its asynchronous counterparts, the BeginGetResponse and EndGetResponse methods. When the requested operation completes, an FtpWebResponse object is returned. The FtpWebResponse object provides the status of the operation and any data downloaded from the server.
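So, as a rough sketch of the sequence the docs describe (assuming $ftprequest and $content from the question): write the bytes, close the request stream, and only then call GetResponse, which actually sends the upload and lets you inspect the server's status:
$rs = $ftprequest.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# the stream must be closed before the request is sent
$rs.Close()
# GetResponse performs the upload and returns an FtpWebResponse
$response = $ftprequest.GetResponse()
Write-Host $response.StatusDescription
$response.Close()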


Uploading a File via FTP using PowerShell

I am writing a PowerShell script that watches a directory, and when a file (or multiple files) is uploaded into the directory, it takes those files, copies them to another folder, sends them to an FTP server, and then deletes them from the original directory.
I am having problems connecting to the FTP server. I am not sure if the problem is the way I am configuring the WebClient, or if the problem is that the FTP URI has spaces in it and I am not escaping them properly.
Here is the code:
$source = "c:/testFtp"
$ftpdestination = "ftp://username:password@ftp.ftpsite.com/folder with space/folder with space"
$webclient = New-Object -TypeName System.Net.WebClient
$files = Get-ChildItem $source
foreach ($file in $files)
{
Write-Host "Uploading $file"
try {
$ftp = "$ftpdestination/$file"
$uri = New-Object -TypeName System.Uri -ArgumentList $ftp
$webclient.UploadFile($uri, "$source/$file")
} catch {
Add-content "$logs" -value "There was an error uploading to ftp"
}
}
$webclient.Dispose()
I have tried escaping the folder spaces multiple ways, so I am beginning to think that is not the problem and that I am not configuring the WebClient properly.
It is also not catching errors very often, so I don't believe the WebClient throws an error when the upload fails. Any help is appreciated!
ANSWER:
It turns out the WebClient was connecting properly, but SSL was blocking the files from being sent. I found this out by using C# to compile and run the script, which gave me better error handling, as I am new to PowerShell scripts and cannot seem to get good error handling.
After researching, I could not find a way to enable SSL with WebClient, so I switched over to FtpWebRequest. Here is the successful code (this try/catch block does not seem to log errors as I would like, but the tool will successfully send files to the FTP server now):
try {
$ftp = [System.Net.FtpWebRequest]::Create("$ftpsite/$filename")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential("$username","$password")
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$ftp.EnableSSL = $true #<-----------------This was the line that made this work
$ftp.KeepAlive = $false
# read in the file to upload as a byte array
$content = [System.IO.File]::ReadAllBytes("$source/$file")
$ftp.ContentLength = $content.Length
# get the request stream, and write the bytes into it
$rs = $ftp.GetRequestStream()
$rs.Write($content, 0, $content.Length)
# be sure to clean up after ourselves
$rs.Close()
$rs.Dispose()
Write-Host "Successfully uploaded: $source/$file"
Copy-Item "$source/$file" -Destination "$copydestination"
Remove-Item "$source/$file"
$logline = "$(Get-Date), Added File: $file to $copydestination"
Add-content "$logs" -value $logline
} catch {
$res = $ftp.GetResponse()
Add-content "$logs" -value "There was an error: $res.StatusCode"
Write-Error -Message "There was an error." -ErrorAction Stop
}

Download files from set of folders on FTP/SFTP server with WinSCP .NET assembly in PowerShell and email results

I have this PowerShell script that I'm working on. A CSV file is imported to get source and destination paths. The goal is to move files from an SFTP/FTP server into a destination and send an email report.
Task Scheduler will run this code every hour, and if there's a new file, an email will be sent out.
It's almost done, but two things are missing:
Checking whether the file already exists, and the email body, which seems to be empty. I'm getting the following error: Cannot validate argument on parameter 'Body'. The argument is null or empty. Provide an argument that is not null or empty, and then try the command again.
I would like some assistance on how to check if the file exists, and how to get this email sent when a new file is dropped and copied to the destination list.
$SMTPBody = ""
$SMTPMessage = @{
"SMTPServer" = ""
"From" = ""
"To" = ""
"Subject" = "New File"
}
try {
# Load WinSCP .NET assembly
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::sftp
HostName = ""
UserName = ""
Password = ""
PortNumber = "22"
FTPMode = ""
GiveUpSecurityAndAcceptAnySshHostKey = $true
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Download files
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.TransferMode = [WinSCP.TransferMode]::Binary
Import-Csv -Path "D:\FILESOURCE.csv" -ErrorAction Stop | foreach {
$synchronizationResult = $session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Local, $_.Destination, $_.Source, $False)
$synchronizationResult.Check()
foreach ($download in $synchronizationResult.Downloads ) {
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
$SMTPBody +=
"`n Files: $($download.FileName -join ', ') `n" +
"Current Location: $($_.Destination)`n"
Send-MailMessage @SMTPMessage -Body $SMTPBody
}
$transferResult =
$session.GetFiles($_.Source, $_.Destination, $False, $transferOptions)
#Find the latest downloaded file
$latestTransfer =
$transferResult.Transfers |
Sort-Object -Property @{ Expression = { (Get-Item $_.Destination).LastWriteTime }
} -Descending | Select-Object -First 1
}
if ($latestTransfer -eq $Null) {
Write-Host "No files found."
$SMTPBody += "There are no new files at the moment"
}
else
{
$lastTimestamp = (Get-Item $latestTransfer.Destination).LastWriteTime
Write-Host (
"Downloaded $($transferResult.Transfers.Count) files, " +
"latest being $($latestTransfer.FileName) with timestamp $lastTimestamp.")
$SMTPBody += "file : $($latestTransfer)"
}
Write-Host "Waiting..."
Start-Sleep -Seconds 5
}
finally
{
Send-MailMessage @SMTPMessage -Body $SMTPBody
# Disconnect, clean up
$session.Dispose()
}
}
catch
{
Write-Host "Error: $($_.Exception.Message)"
}
I believe your code has more problems than you think.
Your combination of SynchronizeDirectories and GetFiles is suspicious. You first download only the new files with SynchronizeDirectories and then you download all files with GetFiles. I do not think you want that.
On any error, the .Check call will throw and you will not collect the error into your report.
You keep sending partial reports with Send-MailMessage in the foreach loop.
This is my take on your problem, hoping I've understood correctly what you want to implement:
$SMTPBody = ""
Import-Csv -Path "FILESOURCE.csv" -ErrorAction Stop | foreach {
Write-Host "$($_.Source) => $($_.Destination)"
$SMTPBody += "$($_.Source) => $($_.Destination)`n"
$synchronizationResult =
$session.SynchronizeDirectories(
[WinSCP.SynchronizationMode]::Local, $_.Destination, $_.Source, $False)
$downloaded = @()
$failed = @()
$latestName = $Null
$latest = $Null
foreach ($download in $synchronizationResult.Downloads)
{
if ($download.Error -eq $Null)
{
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
$downloaded += $download.FileName
$ts = (Get-Item $download.Destination).LastWriteTime
if ($ts -gt $latest)
{
$latestName = $download.FileName;
$latest = $ts
}
}
else
{
Write-Host "File $($download.FileName) download failed" -ForegroundColor Red
$failed += $download.FileName
}
}
if ($downloaded.Count -eq 0)
{
$SMTPBody += "No new files were downloaded`n"
}
else
{
$SMTPBody +=
"Downloaded $($downloaded.Count) files:`n" +
($downloaded -join ", ") + "`n" +
"latest being $($latestName) with timestamp $latest.`n"
}
if ($failed.Count -gt 0)
{
$SMTPBody +=
"Failed to download $($failed.Count) files:`n" +
($failed -join ", ") + "`n"
}
$SMTPBody += "`n"
}
It will give you a report like:
/source1 => C:\dest1
Downloaded 3 files:
/source1/aaa.txt, /source1/bbb.txt, /source1/ccc.txt
latest being /source1/ccc.txt with timestamp 01/29/2020 07:49:07.
/source2 => C:\dest2
Downloaded 1 files:
/source2/aaa.txt
latest being /source2/aaa.txt with timestamp 01/29/2020 07:22:37.
Failed to download 1 files:
/source2/bbb.txt
To check and make sure the CSV file exists before you process the entire thing, you can use Test-Path:
...
if (!(Test-Path D:\FileSource.csv)) {
Write-Output "No File Found"
$SMTPBody += "There are no new files at the moment"
return; # Don't run; file not found. Exit out.
}
Import-Csv -Path "D:\FILESOURCE.csv" -ErrorAction Stop | foreach {
...
And for the Body error you are getting: it is coming from the finally block, because there are cases where $SMTPBody would be null. This will no longer be an issue, because $SMTPBody will have some text when the file is not found at the beginning.
Even though you are using return in the if statement that checks whether the file exists, finally will always get executed. Since we updated $SMTPBody, your Send-MailMessage will no longer error out.
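If you also want to guard against an empty body in other failure paths (for example, when an exception skips all of the $SMTPBody assignments), a small hedged check before the final Send-MailMessage works too:
# Fallback text so Send-MailMessage never receives a null or empty -Body
if ([string]::IsNullOrWhiteSpace($SMTPBody)) {
    $SMTPBody = "Script finished, but no report text was generated (possible earlier error)."
}
Send-MailMessage @SMTPMessage -Body $SMTPBody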
Update
If you want to check whether the file you are downloading already exists, you can use an if statement like this:
foreach ($download in $synchronizationResult.Downloads ) {
if (!(Test-Path (Join-Path "D:" $download.FileName))) {
$SMTPBody += "File $($download.Filename) already exists, skipping."
continue # will go to the next download...
}
Write-Host "File $($download.FileName) downloaded" -ForegroundColor Green
...
If you do get the error regarding the body, that's mostly because your script came across an exception and was sent straight to the finally statement. The finally statement sends the email with an empty body because it was never set (due to the exception). I would recommend using the debugger (stepping through) to see which step causes the exception, and then adding steps to make sure the script doesn't fail.
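For stepping through, a hedged example using Set-PSBreakpoint (the script path and line number here are placeholders, not from the original post):
# Break at a specific line so you can step through each loop iteration
Set-PSBreakpoint -Script "D:\scripts\download-report.ps1" -Line 10
# Or break whenever $SMTPBody is written, to see where (or whether) it gets populated
Set-PSBreakpoint -Script "D:\scripts\download-report.ps1" -Variable SMTPBody -Mode Write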

SharePoint 2013 Powershell - Copy File From One Site Collection To Another

Please can someone assist in helping with the above subject?
I would like to copy one file from a specific folder in a SharePoint site collection to another library (of the same name) in a different SharePoint site collection (but still within the same web application).
I have very little PowerShell experience and have tried a number of Google searches, but cannot seem to find anything that works.
Below is an example of what I have tried to do (lots of Write-Host to try and figure out what is going on), with the error message at the bottom.
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$DestFolder = $dList.Files
$RootFolder = $sList.RootFolder
Write-Host " line 25 -- " $RootFolder
$collfiles1 = $RootFolder.Files
Write-Host " line 27 -- "$collfiles1
Write-Host " line 28 -- "$DestFolder
Write-Host " line 30 -- "$str = $DestinationWebURL"/"$DestinationLibraryTitle"/"$FileName
Write-Host " line 31 -- "$collfiles1.Count
for($i = 0 ; $i -lt $collfiles1.Count ; $i++)
{
Write-Host " line 34 -- "$collfiles1[$i].Name
##Write-Host $FileName
if($collfiles1[$i].Name -eq $FileName)
{
## $str = $DestinationWebURL.Url + $DestinationLibraryTitle + "/" + $FileName
$str = $DestinationWebURL+"/" +$DestinationLibraryTitle+"/"
Write-Host " line 40 -- "$str
Write-Host " line 41 -- "$collfiles1[$i]
$FiletoCopy = $collfiles1[$i].Name
Write-Host " line 43 -- " $FiletoCopy
$FiletoCopy.CopyTo($str,$true)
}
}
Write-Host "Script Completed"
The above example gives the following error:
Cannot find an overload for "CopyTo" and the argument count: "2".
At line:44 char:3
+ $FiletoCopy.CopyTo($str,$true)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodCountCouldNotFindBest
If someone could point me in the right direction that would be very helpful.
Thanks in advance,
Ian.
The following PowerShell is for your reference; it copies a file from a library in one site collection to a library in another site collection, along with its field values.
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
#$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
#$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$SourceFile=$sWeb.GetFile($SourceWebURL+"/"+$SourceLibraryTitle+"/"+$FileName)
$TargetFolder = $dWeb.GetFolder($DestinationLibraryTitle)
#Copy File from the Source
$NewFile = $TargetFolder.Files.Add($SourceFile.Name, $SourceFile.OpenBinary(),$True)
#Copy Meta-Data from Source
Foreach($Field in $SourceFile.Item.Fields)
{
If(!$Field.ReadOnlyField)
{
if($NewFile.Item.Fields.ContainsField($Field.InternalName))
{
$NewFile.Item[$Field.InternalName] = $SourceFile.Item[$Field.InternalName]
}
}
}
#Update
$NewFile.Item.UpdateOverwriteVersion()
Write-host "Copied File:"$SourceFile.Name
Reference: Copy Files Between Document Libraries in SharePoint using PowerShell
In the case of large files, where the file size is greater than 50 MB, the script mentioned by @LZ_MSFT may never be able to copy the file. In that case, you need to chunk the file into small pieces. Here is the PS to copy from source to destination with chunking if the file size is greater than 50 MB. A plus point for this script is that it uses the client-side object model, so it can be used with SP Online and on-prem.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Function UploadFileInSlice ($DestinationCtx, $SourceCtx, $SourceFileUrl, $DestinationFolderUrl, $fileName, $fileChunkSizeInMB) {
# Each sliced upload requires a unique ID.
$UploadId = [GUID]::NewGuid()
# Get File by Server Relative URL
$File = $SourceCtx.Web.GetFileByServerRelativeUrl($SourceFileUrl)
$SourceCtx.Load($File)
# Get file stream with OpenBinaryStream
$StreamToUpload = $File.OpenBinaryStream()
$SourceCtx.ExecuteQuery()
# File size in bytes
$FileSize = ($File).length
# Get Destination Folder by Server Relative URL
$DestinationFolder = $DestinationCtx.Web.GetFolderByServerRelativeUrl($DestinationFolderUrl)
$DestinationCtx.Load($DestinationFolder)
$DestinationCtx.ExecuteQuery()
# Set Complete Destination URL with Destination Folder + FileName
$destUrl = $DestinationFolderUrl + "/" + $fileName
# File object.
[Microsoft.SharePoint.Client.File] $upload
# Calculate block size in bytes.
$BlockSize = $fileChunkSizeInMB * 1000 * 1000
Write-Host "File Size is: $FileSize bytes and Chunking Size is:$BlockSize bytes"
if ($FileSize -le $BlockSize)
{
# Use regular approach if file size less than BlockSize
Write-Host "File uploading with out chunking"
$upload =[Microsoft.SharePoint.Client.File]::SaveBinaryDirect($DestinationCtx, $destUrl, $StreamToUpload.Value, $true)
return $upload
}
else
{
# Use large file upload approach.
$BytesUploaded = $null
$Fs = $null
Try {
$br = New-Object System.IO.BinaryReader($StreamToUpload.Value)
#$br = New-Object System.IO.BinaryReader($Fs)
$buffer = New-Object System.Byte[]($BlockSize)
$lastBuffer = $null
$fileoffset = 0
$totalBytesRead = 0
$bytesRead
$first = $true
$last = $false
# Read data from file system in blocks.
while(($bytesRead = $br.Read($buffer, 0, $buffer.Length)) -gt 0) {
$totalBytesRead = $totalBytesRead + $bytesRead
# You've reached the end of the file.
if($totalBytesRead -eq $FileSize) {
$last = $true
# Copy to a new buffer that has the correct size.
$lastBuffer = New-Object System.Byte[]($bytesRead)
[array]::Copy($buffer, 0, $lastBuffer, 0, $bytesRead)
}
If($first)
{
$ContentStream = New-Object System.IO.MemoryStream
# Add an empty file.
$fileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
$fileCreationInfo.ContentStream = $ContentStream
$fileCreationInfo.Url = $fileName
$fileCreationInfo.Overwrite = $true
#Add file to Destination Folder with file creation info
$Upload = $DestinationFolder.Files.Add($fileCreationInfo)
$DestinationCtx.Load($Upload)
# Start upload by uploading the first slice.
$s = New-Object System.IO.MemoryStream(,$Buffer)
Write-Host "Uploading id is:"+$UploadId
# Call the start upload method on the first slice.
$BytesUploaded = $Upload.StartUpload($UploadId, $s)
$DestinationCtx.ExecuteQuery()
# fileoffset is the pointer where the next slice will be added.
$fileoffset = $BytesUploaded.Value
Write-Host "First patch of file with bytes"+ $fileoffset
# You can only start the upload once.
$first = $false
}
Else
{
# Get a reference to your file.
$Upload = $DestinationCtx.Web.GetFileByServerRelativeUrl($destUrl);
If($last) {
# Is this the last slice of data?
$s = New-Object System.IO.MemoryStream(,$lastBuffer)
# End sliced upload by calling FinishUpload.
$Upload = $Upload.FinishUpload($UploadId, $fileoffset, $s)
$DestinationCtx.ExecuteQuery()
Write-Host "File Upload Completed Successfully!"
# Return the file object for the uploaded file.
return $Upload
}
else {
$s = New-Object System.IO.MemoryStream(,$buffer)
# Continue sliced upload.
$BytesUploaded = $Upload.ContinueUpload($UploadId, $fileoffset, $s)
$DestinationCtx.ExecuteQuery()
# Update fileoffset for the next slice.
$fileoffset = $BytesUploaded.Value
Write-Host "File uploading is in progress with bytes: "+ $fileoffset
}
}
} #// while ((bytesRead = br.Read(buffer, 0, buffer.Length)) > 0)
}
Catch {
Write-Host $_.Exception.Message -ForegroundColor Red
}
Finally {
if ($Fs -ne $null)
{
$Fs.Dispose()
}
}
}
return $null
}
#URL to Configure, in this case Destination is SP Online site URL
#Adding credentials hard-coded; you can use the Get-Credential PS command too
$DestnationSiteUrl = "https://your-domain.sharepoint.com/sites/xyz"
$DestinationRelativeURL = "/sites/xyz/TestLibrary" #server relative URL here with library Name and Folder name
$DestinationUserName = "xyz@your-domain.com"
$DestinationPassword = Read-Host "Enter Password for Destination User: $DestinationUserName" -AsSecureString
#URL to Configure, in this case Source is On-Prem site URL
#Adding credentials hard-coded; you can use the Get-Credential PS command too
$SourceSiteUrl = "http://intranet/sites/xyz"
$SourceRelativeURL = "/sites/xyz/TestLibrary/myfile.pptx" #server relative URL here with library Name and file name with extension
$SourceUsername = "domain\xyz"
$SourcePassword = Read-Host "Enter Password for Source User: $SourceUsername" -AsSecureString
#Set a file name with extension
$FileNameWithExt = "myfile.pptx"
#Get Source Client Context with credentials
$SourceContext = New-Object Microsoft.SharePoint.Client.ClientContext($SourceSiteUrl)
#Using NetworkCredentials in case of On-Prem (assign them to the source context so they are actually used)
$SourceContext.Credentials = New-Object System.Net.NetworkCredential($SourceUsername, $SourcePassword)
$SourceContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$SourceContext.ExecuteQuery();
#Get Destination Client Context with credentials
$DestinationContext = New-Object Microsoft.SharePoint.Client.ClientContext($DestnationSiteUrl)
#Using SharePointOnlineCredentials in case of SP-Online
$DestinationContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($DestinationUserName, $DestinationPassword)
$DestinationContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$DestinationContext.ExecuteQuery();
#All Set up, now just call the UploadFileInSlice with parameters
$UpFile = UploadFileInSlice -DestinationCtx $DestinationContext -SourceCtx $SourceContext -DestinationFolderUrl $DestinationRelativeURL -SourceFileUrl $SourceRelativeURL -fileName $FileNameWithExt -fileChunkSizeInMB 10

Powershell, System.Diagnostics.Process & exiftool stop working when dealing with hundreds of commands

I created a tool (to be precise: a Powershell script) that helps me with converting pictures in folders, i.e. it looks for all files of a certain ending (say, *.TIF) and converts them to JPEGs via ImageMagick. It then transfers some EXIF, IPTC and XMP information from the source image to the JPEG via exiftool:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
[PSCustomObject]@{
SourceFullName = $_.FullName
JPEGFullName = $_.FullName -Replace 'tif$','jpg'
}
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# + + + + The problem occurs somewhere after this line + + + +
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
# creating the string argument for every file, then pass it over to exiftool:
for($i=0; $i -lt $WorkingFiles.length; $i++){
[string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
# using -overwrite_original makes no difference
# Also, just as good as above code:
# [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
$exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
# no difference using start-sleep:
# Start-Sleep -Milliseconds 25
}
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolproc.StandardOutput.ReadToEnd()
$outputout = $outputout -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
If you want to reproduce but do not have/want that many files, there is also a simpler way: let exiftool print out its version number 600 times:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
for($i=0; $i -lt 600; $i++){
try{
$exiftoolproc.StandardInput.WriteLine("-ver`n-execute`n")
Write-Output "Success:`t$i"
}catch{
Write-Output "Failed:`t$i"
}
}
# close exiftool:
try{
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
}catch{
Write-Output "Could not close exiftool!"
}
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[array]$outputout = @($exiftoolproc.StandardOutput.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
As far as I could test, it all goes well as long as you stay below 115 files. If you go above that, the 114th JPEG gets proper metadata, but exiftool stops working after this one - it idles, and my script does so, too. I can reproduce this with different files, paths, and exiftool commands.
Neither StandardOutput nor StandardError shows any irregularities, even with exiftool's -verbose flag - of course, they would not, as I have to kill exiftool to get them to show up.
Running ISE's / VSCode's debugger shows nothing. Exiftool's window (only showing up when debugging) shows nothing.
Is there some hard limit on commands run with System.Diagnostics.Process, is this a problem with exiftool or is this simply due to my incompetence to use something outside the most basic Powershell cmdlets? Or maybe the better question would be: How can I properly debug this?
Powershell is 5.1, exiftool is 10.80 (production) - 10.94 (latest).
After messing around with different variants of $ArgList, I found out that there is no difference when using different file commands, but using commands that produce less StdOut (like -ver) resulted in more iterations. Therefore, I took an educated guess that the output buffer is the culprit.
As per Mark Byers' answer to "ProcessStartInfo hanging on “WaitForExit”? Why?":
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full. [...]
The solution is to use asynchronous reads to ensure that the buffer doesn't get full.
Then, it was just a matter of searching for the right things. I found that Alexander Obersht's answer to "How to capture process output asynchronously in powershell?" provides almost everything that I needed.
The script now looks like this:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
[PSCustomObject]@{
SourceFullName = $_.FullName
JPEGFullName = $_.FullName -Replace 'tif$','jpg'
}
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = ".\exiftool.exe"
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
# + + + + NEW STUFF (1/2) HERE: + + + +
# Creating process object.
$exiftoolproc = New-Object -TypeName System.Diagnostics.Process
$exiftoolproc.StartInfo = $psi
# Creating string builders to store stdout and stderr.
$exiftoolStdOutBuilder = New-Object -TypeName System.Text.StringBuilder
$exiftoolStdErrBuilder = New-Object -TypeName System.Text.StringBuilder
# Adding event handers for stdout and stderr.
$exiftoolScripBlock = {
if (-not [String]::IsNullOrEmpty($EventArgs.Data)){
$Event.MessageData.AppendLine($EventArgs.Data)
}
}
$exiftoolStdOutEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScripBlock -EventName 'OutputDataReceived' -MessageData $exiftoolStdOutBuilder
$exiftoolStdErrEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScripBlock -EventName 'ErrorDataReceived' -MessageData $exiftoolStdErrBuilder
[Void]$exiftoolproc.Start()
$exiftoolproc.BeginOutputReadLine()
$exiftoolproc.BeginErrorReadLine()
# + + + + END OF NEW STUFF (1/2) + + + +
# creating the string argument for every file, then pass it over to exiftool:
for($i=0; $i -lt $WorkingFiles.length; $i++){
[string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
# using -overwrite_original makes no difference
# Also, just as good as above code:
# [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
$exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
}
# + + + + NEW STUFF (2/2) HERE: + + + +
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
$exiftoolproc.WaitForExit()
# Unregistering events to retrieve process output.
Unregister-Event -SourceIdentifier $exiftoolStdOutEvent.Name
Unregister-Event -SourceIdentifier $exiftoolStdErrEvent.Name
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolStdErrBuilder.ToString().Trim().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolStdOutBuilder.ToString().Trim() -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
# + + + + END OF NEW STUFF (2/2) + + + +
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
I can confirm that it works for many, many files (at least 1600).

Getting error output from a powershell 2.0 script running as a task

TL;DR: the actual question is at the bottom
I'm trying to troubleshoot a Powershell v1.0 script issue. The script basically downloads a file from an FTP site and puts it on a remote server via UNC and emails the success or failure of the task.
The script runs as a task with a generic ID that is a Domain Admin but is not used to log into systems so the server it runs off of does not contain a profile for it.
If I do a runas for that user and execute the script via the command line, it works flawlessly. However, if I try to run it as a task, it runs and then exits instantly. If I open a runas command prompt and run the scheduled task via the command line, all I get back is:
SUCCESS: Attempted to run the scheduled task "Task Name".
I've tried writing variable values to a text file to see what is going on but it never writes even when I write them as the very first step of execution.
What I want to do is capture any script error messages you would normally see when trying to run the script and/or write the variable information to a text file.
Is there any way to do this? BTW, I'm doing this by calling powershell with the following arguments:
-file -ExecutionPolicy Bypass "d:\datscript\myscript.ps1"
-I've tried -command instead of -file.
-I've tried "d:\datscript\myscript.ps1 5>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 9>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 | out-file d:\datscript\test.txt"
Nothing worked. I'm sure I can fix whatever bug I have but I'm banging my head against the wall trying to get some kind of failure info.
--Update: Here is a copy of the script minus details--
#-------------------------------------------------------------------------------------------------------------------------------------------------------------
#
#Variable Declaration
#
#$path = Path on local server to download DAT to
#$olddat = Old/last DAT downloaded
#$currentdat = Next DAT number
#$ftpsite = McAfee FTP site. Update if path changes
#$ftpuser = FTP user (anon login)
#$ftppass = FTP password (anon login)
#$tempstring = Manipulation variable
#$gotdat = Boolean if updated DAT exists
#$success = Status if a new DAT exists and has been downloaded (used for email notification).
#$thetime = Variable used to hold time-of-day manipulation.
$path = "\\myservername\ftproot\pub\mcafee\datfiles\"
$olddat = ""
$currentdat =""
$ftpsite = "ftp://ftp.nai.com/virusdefs/4.x/"
$ftpuser = "something"
$ftppass = "anything"
$tempstring =""
$gotdat = "False"
$success = ""
$thetime = ""
#
#Normalized functions handles UNC paths
#
function Get-NormalizedFileSystemPath
{
<#
.Synopsis
Normalizes file system paths.
.DESCRIPTION
Normalizes file system paths. This is similar to what the Resolve-Path cmdlet does, except Get-NormalizedFileSystemPath also properly handles UNC paths and converts 8.3 short names to long paths.
.PARAMETER Path
The path or paths to be normalized.
.PARAMETER IncludeProviderPrefix
If this switch is passed, normalized paths will be prefixed with 'FileSystem::'. This allows them to be reliably passed to cmdlets such as Get-Content, Get-Item, etc, regardless of Powershell's current location.
.EXAMPLE
Get-NormalizedFileSystemPath -Path '\\server\share\.\SomeFolder\..\SomeOtherFolder\File.txt'
Returns '\\server\share\SomeOtherFolder\File.txt'
.EXAMPLE
'\\server\c$\.\SomeFolder\..\PROGRA~1' | Get-NormalizedFileSystemPath -IncludeProviderPrefix
Assuming you can access the c$ share on \\server, and PROGRA~1 is the short name for "Program Files" (which is common), returns:
'FileSystem::\\server\c$\Program Files'
.INPUTS
String
.OUTPUTS
String
.NOTES
Paths passed to this command cannot contain wildcards; these will be treated as invalid characters by the .NET Framework classes which do the work of validating and normalizing the path.
.LINK
Resolve-Path
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('PSPath', 'FullName')]
[string[]]
$Path,
[switch]
$IncludeProviderPrefix
)
process
{
foreach ($_path in $Path)
{
$_resolved = $_path
if ($_resolved -match '^([^:]+)::')
{
$providerName = $matches[1]
if ($providerName -ne 'FileSystem')
{
Write-Error "Only FileSystem paths may be passed to Get-NormalizedFileSystemPath. Value '$_path' is for provider '$providerName'."
continue
}
$_resolved = $_resolved.Substring($matches[0].Length)
}
if (-not [System.IO.Path]::IsPathRooted($_resolved))
{
$_resolved = Join-Path -Path $PSCmdlet.SessionState.Path.CurrentFileSystemLocation -ChildPath $_resolved
}
try
{
$dirInfo = New-Object System.IO.DirectoryInfo($_resolved)
}
catch
{
$exception = $_.Exception
while ($null -ne $exception.InnerException)
{
$exception = $exception.InnerException
}
Write-Error "Value '$_path' could not be parsed as a FileSystem path: $($exception.Message)"
continue
}
$_resolved = $dirInfo.FullName
if ($IncludeProviderPrefix)
{
$_resolved = "FileSystem::$_resolved"
}
Write-Output $_resolved
}
} # process
} # function Get-NormalizedFileSystemPath
#
#Get the number of the existing DAT file and increment for the next DAT if the DAT's age is older than today.
# Otherwise, exit the program if DATs age is today.
#
$tempstring = "xdat.exe"
$env:Path = $env:Path + ";d:\datscript"
$path2 ="d:\datscript\debug.txt"
add-content $path2 $path
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
$path = Get-NormalizedFileSystemPath -Path $path
Set-Location -Path $path
$olddat = dir $path | %{$_.Name.substring(0, 4) }
$olddatfull = "$olddat" + "$tempstring"
if ( ((get-date) - (ls $olddatfull).LastWriteTime).day -lt 1)
{
#***** Commented out for testing!
# exit
}
$currentdat = [INT] $olddat
$currentdat++
$currentdat = "$currentdat" + "$tempstring"
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
#
#Connect to FTP site and get a current directory listing.
#
[System.Net.FtpWebRequest]$ftp = [System.Net.WebRequest]::Create($ftpsite)
$ftp.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$response = $ftp.getresponse()
$stream = $response.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
#
# Read all the data available from the ftp directory stream, writing it to the
# output buffer when done. After that the buffer is searched to see if it contains the expected
# latest DAT.
#
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
} catch { $foundMore = $false; $read = 0 }
} while($read -gt 0)
} while($foundmore)
$gotdat = $outputbuffer.Contains($currentdat)
$target = $path + $currentdat
#
# Downloads DATs and cleans up old DAT file. Returns status of the operation.
# Return 1 = success
# Return 2 = Latest DAT not found and 4pm or later
# Return 3 = DAT available but did not download or is 0 bytes
# Return 4 = Latest DAT not found and before 4pm
#
$success = 0
if ($gotdat -eq "True")
{
$ftpfile = $ftpsite + $ftppath + $currentdat
write-host $ftpfile
write-host $target
$ftpclient = New-Object system.Net.WebClient
$uri = New-Object System.Uri($ftpfile)
$ftpclient.DownloadFile($uri, $target)
Start-Sleep -s 30
if ( ((get-date) - (ls $target).LastWriteTime).days -ge 1)
{
$success = 3
}
else
{
$testlength = (get-item $target).length
if( (get-item $target).length -gt 0)
{
Remove-Item "$olddatfull"
$success = 1
}
else
{
$success = 3
}
}
}
else
{
$thetime = Get-Date
$thetime = $thetime.Hour
if ($thetime -ge 16)
{
$success = 2
}
else
{
$success = 4
exit
}
}
#
# If successful download (success = 1) run push bat
#
if ($success -eq 1)
{
Start-Process "cmd.exe" "/c c:\scripts\mcafeepush.bat"
}
#Email structure
#
#Sends result email based on previous determination
#
#SMTP server name
$smtpServer = "emailserver.domain.com"
#Creating a Mail object
$msg = new-object Net.Mail.MailMessage
#Creating SMTP server object
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "email1@domain.com"
$msg.ReplyTo = "email2@domain.com"
$msg.To.Add("email2@domain.com")
switch ($success)
{
1 {
$msg.subject = "McAfee Dats $currentdat successful"
$msg.body = ("DAT download completed successfully. Automaton v1.0")
}
2 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("Looking for DAT $currentdat on the FTP site but I coud not find it. Human intervention may be required. Automaton v1.0")
}
3 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("$currentdat is available for download but download has failed. Human intervention will be required. Automaton v1.0")
}
default {
$msg.subject = "DAT Automaton Error"
$msg.body = ("Something broke with the McAfee automation script. Human intervention will be required. Automaton v1.0")
}
}
#Sending email
$smtp.Send($msg)
#Needed to keep the program from exiting too fast.
Start-Sleep -s 30
#debugging stuff
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
Apparently you have an error in starting PowerShell, either because the execution policy is different on the PowerShell version you start, or on the account, or there is an access error on the scheduled task. To gather the actual error, you can launch the task like so:
cmd /c "powershell.exe -file d:\datscript\myscript.ps1 test.txt 2>&1" >c:\windows\temp\test.log 2>&1
This way, if there is an error in starting PowerShell, it will be logged in the c:\windows\temp\test.log file. If the issue is the execution policy, you can create and run (once) a task with the following:
powershell -command "Get-ExecutionPolicy -List | out-file c:/windows/temp/policy.txt; Set-ExecutionPolicy RemoteSigned -Scope LocalMachine -Force"
Running a task under the account you plan to run your main task under will first get the policies in effect (so that if setting the machine-level policy doesn't help, you'll know what scope to alter) and then set the machine-level policy to "RemoteSigned", the least restrictive level short of allowing every script (which is highly not recommended; there are encoder scripts written in PowerShell that can ruin your data).
Hope this helps.
UPDATE: If it's not the policy, there might be some errors in how the parameters for the task are written. You can do this: create a .bat file with the string that launches your script and redirects output to, say, test1.txt, then change the scheduled task to cmd.exe /c launcher.bat >test2.txt, properly specifying the home folder. Run the task and review both files; at least one of them should contain an error that prevents your script from launching.
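As a rough sketch of that wrapper (file names and paths are only placeholders): launcher.bat could contain a single line such as
powershell.exe -ExecutionPolicy Bypass -File d:\datscript\myscript.ps1 > d:\datscript\test1.txt 2>&1
and the scheduled task action would then be cmd.exe /c d:\datscript\launcher.bat >d:\datscript\test2.txt 2>&1, with d:\datscript set as the start-in folder. test1.txt captures anything the script itself writes, while test2.txt captures errors from launching PowerShell at all.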