Using PowerShell to download all files in a directory

I have a working PowerShell script that iterates through a directory accessed via FTP and prints its contents. The script looks like this:
$sourceuri = "<string>"
$targetpath = "<string>"
$username = "<string>"
$password = "<string>"
# Create a FTPWebRequest object to handle the connection to the ftp server
$ftprequest = [System.Net.FtpWebRequest]::create($sourceuri)
# set the request's network credentials for an authenticated connection
$ftprequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
$ftprequest.UseBinary = $true
$ftprequest.KeepAlive = $false
# send the ftp request to the server
$ftpresponse = $ftprequest.GetResponse()
$stream = $ftpresponse.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
## Read all the data available from the stream, writing it to the
## output buffer when done.
do
{
    ## Allow data to buffer for a bit
    Start-Sleep -Milliseconds 1000
    ## Read what data is available
    $foundmore = $false
    $stream.ReadTimeout = 1000
    do
    {
        try
        {
            $read = $stream.Read($buffer, 0, 1024)
            if($read -gt 0)
            {
                $foundmore = $true
                $outputBuffer += ($encoding.GetString($buffer, 0, $read))
            }
        }
        catch
        {
            $foundmore = $false; $read = 0
        }
    }
    while($read -gt 0)
}
while($foundmore)
$outputBuffer
My actual goal is not to just list the files but download them to the machine running the script. I'm finding this a bit tricky since my loops only reference bytes and not files by name. How can I use this loop to download all the files in a directory instead of just listing them?

I prefer to use a real FTP library, like the one from WinSCP, but there are other examples out there. http://winscp.net/eng/docs/library_session_listdirectory#example
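For illustration, a minimal sketch with the WinSCP .NET assembly; the DLL path, host name, and remote path below are placeholders to adapt:
# Load the WinSCP .NET assembly (install path is an assumption)
Add-Type -Path "C:\Program Files (x86)\WinSCP\WinSCPnet.dll"
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = $username
    Password = $password
}
$session = New-Object WinSCP.Session
try
{
    $session.Open($sessionOptions)
    # Download everything in the remote directory to the local target path
    $session.GetFiles("/remote/dir/*", "C:\local\target\*").Check()
}
finally
{
    $session.Dispose()
}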

Once you have the list of directories and files, use Get-Content to read from the list, and then use BITS transfer to download all the files. I'm not sure if this will download the directories as well... but it will definitely download the files...
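One caveat: Start-BitsTransfer only accepts http://, https://, and SMB paths, not ftp:// sources, so for a pure FtpWebRequest approach you can ask the server for plain names with ListDirectory and then issue one DownloadFile request per name. A minimal sketch, reusing $sourceuri, $targetpath, $username and $password from the question (and assuming $sourceuri points at the directory without a trailing slash):
# List plain file names, one per line
$listRequest = [System.Net.FtpWebRequest]::Create($sourceuri)
$listRequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$reader = New-Object System.IO.StreamReader($listRequest.GetResponse().GetResponseStream())
$names = @()
while (($name = $reader.ReadLine()) -ne $null) { $names += $name }
$reader.Close()
# Download each listed name as a file (directories would need extra handling)
foreach ($name in $names)
{
    $downloadRequest = [System.Net.FtpWebRequest]::Create("$sourceuri/$name")
    $downloadRequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
    $downloadRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
    $downloadRequest.UseBinary = $true
    $response = $downloadRequest.GetResponse()
    $stream = $response.GetResponseStream()
    $target = New-Object IO.FileStream((Join-Path $targetpath $name), [IO.FileMode]::Create)
    $buffer = New-Object byte[] 1024
    do {
        $read = $stream.Read($buffer, 0, 1024)
        $target.Write($buffer, 0, $read)
    } while ($read -gt 0)
    $target.Close()
    $response.Close()
}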

Related

PowerShell - Office 365 mailbox - get emails from specific folder

I need to get all emails from an Office 365 account, not from the Inbox but from a specific folder, "Processed"
$mail="automate@company.com"
$password="pass"
# Set the path to your copy of EWS Managed API
$dllpath = "C:\Program Files\Microsoft\Exchange\Web Services\2.0\Microsoft.Exchange.WebServices.dll"
# Load the Assembly
[void][Reflection.Assembly]::LoadFile($dllpath)
# Create a new Exchange service object
$service = new-object Microsoft.Exchange.WebServices.Data.ExchangeService
#These are your O365 credentials
$Service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($mail,$password)
# this TestUrlCallback is purely a security check
$TestUrlCallback = {
    param ([string] $url)
    if ($url -eq "https://autodiscover-s.outlook.com/autodiscover/autodiscover.xml") {$true} else {$false}
}
# Autodiscover using the mail address set above
$service.AutodiscoverUrl($mail,$TestUrlCallback)
# create Property Set to include body and header of email
$PropertySet = New-Object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)
# set email body to text
$PropertySet.RequestedBodyType = [Microsoft.Exchange.WebServices.Data.BodyType]::Text;
# Set how many emails we want to read at a time
$numOfEmailsToRead = 100
# Index to keep track of where we are up to. Set to 0 initially.
$index = 0
# Do/while loop for paging through the folder
do
{
    # Set what we want to retrieve from the folder. This will grab the first $pagesize emails
    $view = New-Object Microsoft.Exchange.WebServices.Data.ItemView($numOfEmailsToRead,$index)
    # Retrieve the data from the folder
    # $findResults = $service.FindItems([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Processed,$view)
    $findResults = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo([Microsoft.Exchange.WebServices.Data.FolderSchema]::DisplayName,"Processed")
}
But the $findResults variable is empty, although there are emails in that folder. Is there any way to get emails from a specific folder in Office 365?
Found the solution; this post helped me a lot.
$mail="mail#company.com"
$password="pass"
$USER_DEFINED_FOLDER_IN_MAILBOX = "Processed"
# Set the path to your copy of EWS Managed API
$dllpath = "C:\Program Files\Microsoft\Exchange\Web Services\2.0\Microsoft.Exchange.WebServices.dll"
# Load the Assembly
[void][Reflection.Assembly]::LoadFile($dllpath)
# Create a new Exchange service object
$service = new-object Microsoft.Exchange.WebServices.Data.ExchangeService
#These are your O365 credentials
$Service.Credentials = New-Object Microsoft.Exchange.WebServices.Data.WebCredentials($mail,$password)
# this TestUrlCallback is purely a security check
$TestUrlCallback = {
    param ([string] $url)
    if ($url -eq "https://autodiscover-s.outlook.com/autodiscover/autodiscover.xml") {$true} else {$false}
}
# Autodiscover using the mail address set above
$service.AutodiscoverUrl($mail,$TestUrlCallback)
# get a handle to the inbox
$inbox = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service,[Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Inbox)
$MailboxRootid = new-object Microsoft.Exchange.WebServices.Data.FolderId([Microsoft.Exchange.WebServices.Data.WellKnownFolderName]::Root, $mail) # selection and creation of new root
$MailboxRoot = [Microsoft.Exchange.WebServices.Data.Folder]::Bind($service,$MailboxRootid)
$fvFolderView = new-object Microsoft.Exchange.WebServices.Data.FolderView(100) #page size for displayed folders
$fvFolderView.Traversal = [Microsoft.Exchange.WebServices.Data.FolderTraversal]::Deep; #Search traversal selection Deep = recursively
$SfSearchFilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+IsEqualTo([Microsoft.Exchange.WebServices.Data.FolderSchema]::Displayname,$USER_DEFINED_FOLDER_IN_MAILBOX) #for each folder in mailbox define search
$findFolderResults = $MailboxRoot.FindFolders($SfSearchFilter,$fvFolderView)
$ArchiveFolder = ""
# create Property Set to include body and header of email
$PropertySet = New-Object Microsoft.Exchange.WebServices.Data.PropertySet([Microsoft.Exchange.WebServices.Data.BasePropertySet]::FirstClassProperties)
# set email body to text
$PropertySet.RequestedBodyType = [Microsoft.Exchange.WebServices.Data.BodyType]::Text;
# This next loop successfully finds my folder, but it is an inefficient way
# to do it. It's ok, because there's not that many folders, but there's tens
# of thousands of emails to search through in the folder itself, and that will
# need a more efficient search.
foreach ($Fdr in $findFolderResults.Folders)
{
    $theDisplayName = $Fdr.DisplayName
    if($theDisplayName -eq $USER_DEFINED_FOLDER_IN_MAILBOX)
    {
        $ArchiveFolder = $Fdr
    }
}
# Now to actually try and search through the emails in my $ArchiveFolder (the hard way)
$textToFindInSubject = "Remove"
$emailsInFolder = $ArchiveFolder.FindItems(9999) # <-- Successfully finds ALL emails with no filtering, requiring iterative code to find the ones I want.
foreach($individualEmail in $emailsInFolder.Items)
{
    if($individualEmail.Subject -match "$textToFindInSubject")
    {
        # found the email i want - but a super inefficient
        # way to do it
        echo "Successfully found the email!"
    }
}
$searchfilter = new-object Microsoft.Exchange.WebServices.Data.SearchFilter+ContainsSubstring([Microsoft.Exchange.WebServices.Data.EmailMessageSchema]::Subject,$textToFindInSubject)
$itemView = new-object Microsoft.Exchange.WebServices.Data.ItemView(999)
$searchResults = $service.FindItems($ArchiveFolder.ID, $searchfilter, $itemView)
foreach($result in $searchResults)
{
    $result.Load($PropertySet)
    $subj = $result.Subject
    echo "Subject: $subj"
    echo "Body: $($result.Body.Text)"
}

SharePoint 2013 Powershell - Copy File From One Site Collection To Another

Can someone please assist with the above subject?
I would like to copy one file from a specific folder in a SharePoint site collection to a library of the same name in a different SharePoint site collection (but still within the same Web Application).
I have very little PowerShell experience and have tried a number of Google searches, but cannot seem to find anything that works.
Below is an example of what I have tried to do (with lots of Write-Host calls to try and figure out what is going on), with the error message at the bottom.
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$DestFolder = $dList.Files
$RootFolder = $sList.RootFolder
Write-Host " line 25 -- " $RootFolder
$collfiles1 = $RootFolder.Files
Write-Host " line 27 -- "$collfiles1
Write-Host " line 28 -- "$DestFolder
Write-Host " line 30 -- "$str = $DestinationWebURL"/"$DestinationLibraryTitle"/"$FileName
Write-Host " line 31 -- "$collfiles1.Count
for($i = 0 ; $i -lt $collfiles1.Count ; $i++)
{
    Write-Host " line 34 -- "$collfiles1[$i].Name
    ##Write-Host $FileName
    if($collfiles1[$i].Name -eq $FileName)
    {
        ## $str = $DestinationWebURL.Url + $DestinationLibraryTitle + "/" + $FileName
        $str = $DestinationWebURL+"/" +$DestinationLibraryTitle+"/"
        Write-Host " line 40 -- "$str
        Write-Host " line 41 -- "$collfiles1[$i]
        $FiletoCopy = $collfiles1[$i].Name
        Write-Host " line 43 -- " $FiletoCopy
        $FiletoCopy.CopyTo($str,$true)
    }
}
Write-Host "Script Completed"
The below example gives the error
Cannot find an overload for "CopyTo" and the argument count: "2".
At line:44 char:3
+ $FiletoCopy.CopyTo($str,$true)
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodException
+ FullyQualifiedErrorId : MethodCountCouldNotFindBest
If someone could point me in the right direction that would be very helpful.
Thanks in advance,
Ian.
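For reference, the error itself occurs because $FiletoCopy is assigned $collfiles1[$i].Name, which is a plain string, so PowerShell resolves String.CopyTo (a four-argument method) instead of SPFile.CopyTo and finds no two-argument overload. Keeping the SPFile object fixes the call, roughly like this (same variable names as the question), but note that SPFile.CopyTo does not work across site collections, which is why the answer below streams the file's bytes instead:
$FiletoCopy = $collfiles1[$i]   # keep the SPFile object, not its Name
$str = $DestinationWebURL + "/" + $DestinationLibraryTitle + "/" + $FileName
$FiletoCopy.CopyTo($str, $true) # works only within the same site collection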
The following PowerShell is for your reference; it copies a file from a library in one site collection to a library in another site collection, along with its field values.
Add-PSSnapIn "Microsoft.SharePoint.PowerShell"
##
#Set Static Variables
##
$SourceWebURL = "http://WebAppURL/sites/Area/Master"
$SourceLibraryTitle = "Web"
$DestinationWebURL = "http://WebAppURL/sites/OtherSiteName"
$DestinationLibraryTitle = "Web"
$FileName = "Resources.aspx"
##
#Begin Script
##
$sWeb = Get-SPWeb $SourceWebURL
#$sList = $sWeb.Lists | ? {$_.Title -eq $SourceLibraryTitle}
$dWeb = Get-SPWeb $DestinationWebURL
#$dList = $dWeb.Lists | ? {$_.title -like $DestinationLibraryTitle}
$SourceFile=$sWeb.GetFile($SourceWebURL+"/"+$SourceLibraryTitle+"/"+$FileName)
$TargetFolder = $dWeb.GetFolder($DestinationLibraryTitle)
#Copy File from the Source
$NewFile = $TargetFolder.Files.Add($SourceFile.Name, $SourceFile.OpenBinary(),$True)
#Copy Meta-Data from Source
Foreach($Field in $SourceFile.Item.Fields)
{
    If(!$Field.ReadOnlyField)
    {
        if($NewFile.Item.Fields.ContainsField($Field.InternalName))
        {
            $NewFile.Item[$Field.InternalName] = $SourceFile.Item[$Field.InternalName]
        }
    }
}
#Update
$NewFile.Item.UpdateOverwriteVersion()
Write-host "Copied File:"$SourceFile.Name
Reference: Copy Files Between Document Libraries in SharePoint using PowerShell
In the case of large files, where the file size is greater than 50 MB, the script mentioned by @LZ_MSFT may never be able to copy the file. In that case, you need to split the file into small chunks. Here is the PowerShell to copy from source to destination, chunking the file if it is larger than 50 MB. A plus point for this script is that it uses the client-side object model, so it can be used with both SharePoint Online and on-premises.
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.dll"
Add-Type -Path "C:\Program Files\Common Files\microsoft shared\Web Server Extensions\16\ISAPI\Microsoft.SharePoint.Client.Runtime.dll"
Function UploadFileInSlice ($DestinationCtx, $SourceCtx, $SourceFileUrl, $DestinationFolderUrl, $fileName, $fileChunkSizeInMB) {
    # Each sliced upload requires a unique ID.
    $UploadId = [GUID]::NewGuid()
    # Get File by Server Relative URL
    $File = $SourceCtx.Web.GetFileByServerRelativeUrl($SourceFileUrl)
    $SourceCtx.Load($File)
    # Get the file stream with OpenBinaryStream
    $StreamToUpload = $File.OpenBinaryStream()
    $SourceCtx.ExecuteQuery()
    # File size in bytes
    $FileSize = $File.Length
    # Get Destination Folder by Server Relative URL (use the $DestinationCtx parameter, not a global)
    $DestinationFolder = $DestinationCtx.Web.GetFolderByServerRelativeUrl($DestinationFolderUrl)
    $DestinationCtx.Load($DestinationFolder)
    $DestinationCtx.ExecuteQuery()
    # Set complete destination URL: destination folder + file name
    $destUrl = $DestinationFolderUrl + "/" + $fileName
    # File object.
    [Microsoft.SharePoint.Client.File] $upload = $null
    # Calculate block size in bytes.
    $BlockSize = $fileChunkSizeInMB * 1000 * 1000
    Write-Host "File Size is: $FileSize bytes and Chunking Size is: $BlockSize bytes"
    if ($FileSize -le $BlockSize)
    {
        # Use the regular approach if the file size is less than BlockSize
        Write-Host "File uploading without chunking"
        $upload = [Microsoft.SharePoint.Client.File]::SaveBinaryDirect($DestinationCtx, $destUrl, $StreamToUpload.Value, $true)
        return $upload
    }
    else
    {
        # Use the large-file (sliced) upload approach.
        $BytesUploaded = $null
        $Fs = $null
        Try {
            $br = New-Object System.IO.BinaryReader($StreamToUpload.Value)
            #$br = New-Object System.IO.BinaryReader($Fs)
            $buffer = New-Object System.Byte[]($BlockSize)
            $lastBuffer = $null
            $fileoffset = 0
            $totalBytesRead = 0
            $bytesRead = 0
            $first = $true
            $last = $false
            # Read data from the source stream in blocks.
            while(($bytesRead = $br.Read($buffer, 0, $buffer.Length)) -gt 0) {
                $totalBytesRead = $totalBytesRead + $bytesRead
                # You've reached the end of the file.
                if($totalBytesRead -eq $FileSize) {
                    $last = $true
                    # Copy to a new buffer that has the correct size.
                    $lastBuffer = New-Object System.Byte[]($bytesRead)
                    [array]::Copy($buffer, 0, $lastBuffer, 0, $bytesRead)
                }
                If($first)
                {
                    $ContentStream = New-Object System.IO.MemoryStream
                    # Add an empty file.
                    $fileCreationInfo = New-Object Microsoft.SharePoint.Client.FileCreationInformation
                    $fileCreationInfo.ContentStream = $ContentStream
                    $fileCreationInfo.Url = $fileName
                    $fileCreationInfo.Overwrite = $true
                    # Add the file to the destination folder with the file creation info
                    $Upload = $DestinationFolder.Files.Add($fileCreationInfo)
                    $DestinationCtx.Load($Upload)
                    # Start the upload by uploading the first slice.
                    $s = New-Object System.IO.MemoryStream(,$buffer)
                    Write-Host "Uploading id is: $UploadId"
                    # Call the start upload method on the first slice.
                    $BytesUploaded = $Upload.StartUpload($UploadId, $s)
                    $DestinationCtx.ExecuteQuery()
                    # fileoffset is the pointer where the next slice will be added.
                    $fileoffset = $BytesUploaded.Value
                    Write-Host "First chunk of file uploaded, bytes: $fileoffset"
                    # You can only start the upload once.
                    $first = $false
                }
                Else
                {
                    # Get a reference to the in-progress file.
                    $Upload = $DestinationCtx.Web.GetFileByServerRelativeUrl($destUrl)
                    If($last) {
                        # This is the last slice of data.
                        $s = New-Object System.IO.MemoryStream(,$lastBuffer)
                        # End the sliced upload by calling FinishUpload.
                        $Upload = $Upload.FinishUpload($UploadId, $fileoffset, $s)
                        $DestinationCtx.ExecuteQuery()
                        Write-Host "File Upload Completed Successfully!"
                        # Return the file object for the uploaded file.
                        return $Upload
                    }
                    else {
                        $s = New-Object System.IO.MemoryStream(,$buffer)
                        # Continue the sliced upload.
                        $BytesUploaded = $Upload.ContinueUpload($UploadId, $fileoffset, $s)
                        $DestinationCtx.ExecuteQuery()
                        # Update fileoffset for the next slice.
                        $fileoffset = $BytesUploaded.Value
                        Write-Host "File uploading is in progress, bytes: $fileoffset"
                    }
                }
            } #// end while: read blocks until the stream is exhausted
        }
        Catch {
            Write-Host $_.Exception.Message -ForegroundColor Red
        }
        Finally {
            if ($Fs -ne $null)
            {
                $Fs.Dispose()
            }
        }
    }
    return $null
}
#URL to Configure, in this case Destination is SP Online site URL
#Adding up credentials hard-code, you can use Get-Credentails PS command too
$DestnationSiteUrl = "https://your-domain.sharepoint.com/sites/xyz"
$DestinationRelativeURL = "/sites/xyz/TestLibrary" #server relative URL here with library Name and Folder name
$DestinationUserName = "xyz@your-domain.com"
$DestinationPassword = Read-Host "Enter Password for Destination User: $DestinationUserName" -AsSecureString
#URL to Configure, in this case Source is On-Prem site URL
#Adding up credentials hard-code, you can use Get-Credentails PS command too
$SourceSiteUrl = "http://intranet/sites/xyz"
$SourceRelativeURL = "/sites/xyz/TestLibrary/myfile.pptx" #server relative URL here with library Name and file name with extension
$SourceUsername = "domain\xyz"
$SourcePassword = Read-Host "Enter Password for Source User: $SourceUsername" -AsSecureString
#Set a file name with extension
$FileNameWithExt = "myfile.pptx"
#Get Source Client Context with credentials
$SourceContext = New-Object Microsoft.SharePoint.Client.ClientContext($SourceSiteUrl)
#Using NetworkCredentials in case of On-Prem
$SourceContext.Credentials = New-Object System.Net.NetworkCredential($SourceUsername, $SourcePassword)
$SourceContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$SourceContext.ExecuteQuery();
#Get Destination Client Context with credentials
$DestinationContext = New-Object Microsoft.SharePoint.Client.ClientContext($DestnationSiteUrl)
#Using SharePointOnlineCredentials in case of SP-Online
$DestinationContext.Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($DestinationUserName, $DestinationPassword)
$DestinationContext.RequestTimeout = [System.Threading.Timeout]::Infinite
$DestinationContext.ExecuteQuery();
#All Set up, now just call the UploadFileInSlice with parameters
$UpFile = UploadFileInSlice -DestinationCtx $DestinationContext -SourceCtx $SourceContext -DestinationFolderUrl $DestinationRelativeURL -SourceFileUrl $SourceRelativeURL -fileName $FileNameWithExt -fileChunkSizeInMB 10
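To sanity-check the result, you can load the uploaded file's metadata back from the destination and inspect its size (a sketch using the variables defined above):
$uploaded = $DestinationContext.Web.GetFileByServerRelativeUrl($DestinationRelativeURL + "/" + $FileNameWithExt)
$DestinationContext.Load($uploaded)
$DestinationContext.ExecuteQuery()
Write-Host "Uploaded file size: $($uploaded.Length) bytes"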

Powershell read file in chunks

I had a script written in PowerShell which transferred a file via FTP; it worked fine using:
$content = [System.IO.File]::ReadAllBytes($backup_app_data)
But this stopped working once the file size reached 2 GB, throwing an error saying that this method is limited to reading files up to this size.
Exception calling "ReadAllBytes" with "1" argument(s): "The file is
too long. This operation is currently limited to supporting files less
than 2 gigabytes in size.
Now I'm trying to modify my script and use an alternate approach with Read, which from what I understand will let me read the file in chunks and then write those chunks out one at a time.
Unfortunately, although the modified script seems to run, since a file is created in my FTP location, no data is actually transferred and no error is thrown.
There is just a 0 KB file created in the destination folder, and the script ends.
I've tried to do some debugging, but I can't seem to find the issue. Below is the code I'm currently using.
$file = 'G:\Backups\DATA_backup_2016_09_05.zip'
$dest_name = 'DATA_backup_2016_09_05.zip'
$ftp = [System.Net.FtpWebRequest]::Create($ftp+$dest_name)
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential($user, $pass)
$ftp.UseBinary = $true
$ftp.UsePassive = $true
# determine the size of the file
$file_size = (Get-Item $file).length
$chunk_size = 512mb
$bytes_to_read = $file_size
$iterations = $file_size / $chunk_size
$iter = 0
$fstream = [System.IO.FileStream]
[byte[]] $byte_array
while ($bytes_to_read > 0){
    if($iterations > 1) {
        $content = $fstream.Read($byte_array, $iter, $chunk_size)
    }
    else {
        $content = $fstream.Read($byte_array, $iter, $bytes_to_read)
    }
    $ftp.ContentLength = $content.Length
    $rs = $ftp.GetRequestStream()
    $rs.Write($content, 0, $content.Length)
    # keep the loop going
    $iter = $iter + 1
    $iterations = $iterations - 1
    $bytes_to_read = $bytes_to_read - $chunk_size
}
$rs.Close()
$rs.Dispose()
Any help is appreciated.
So, based on the suggestions in the comment section and using the answer from the possible duplicate question as an example, I managed to find the solution myself. (For anyone comparing with the code above: PowerShell's comparison operator is -gt, not >; inside a condition, > performs redirection, so the original while loop never ran, which is why only a 0 KB file was created. The FileStream was also never actually opened; only the type name was assigned.)
Below is the code I used to transfer a .zip archive of 3.74 GB via FTP.
$ftp_addr = "ftp://ftp.somerandomwebsite.com/folder1/"
$user = "usr"
$pass = "passwrd"
$bufSize = 256mb
# ... missing code where I identify the file I want to FTP ... #
# Initialize connection to FTP
$ftp = [System.Net.FtpWebRequest]::Create($ftp_addr + $backup_file + ".zip")
$ftp = [System.Net.FtpWebRequest]$ftp
$ftp.Method = [System.Net.WebRequestMethods+Ftp]::UploadFile
$ftp.Credentials = new-object System.Net.NetworkCredential($user, $pass)
$ftp.Timeout = -1 #infinite timeout
$ftp.ReadWriteTimeout = -1 #infinite timeout
$ftp.UseBinary = $true
$ftp.UsePassive = $true
$requestStream = $ftp.GetRequestStream()
$fileStream = [System.IO.File]::OpenRead($file_to_ftp)
$chunk = New-Object byte[] $bufSize
while ($bytesRead = $fileStream.Read($chunk, 0, $bufSize)){
    $requestStream.Write($chunk, 0, $bytesRead)
    $requestStream.Flush()
}
$fileStream.Close()
$requestStream.Close()
So far, this code has worked flawlessly across multiple (20+) transfers. I hope it helps others!
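If you want visibility into long transfers, the same loop can report progress per chunk (a sketch; $file_to_ftp as in the script above):
$fileSize = (Get-Item $file_to_ftp).Length
$totalSent = 0
while ($bytesRead = $fileStream.Read($chunk, 0, $bufSize)){
    $requestStream.Write($chunk, 0, $bytesRead)
    $requestStream.Flush()
    $totalSent += $bytesRead
    Write-Progress -Activity "FTP upload" -PercentComplete ([int](100 * $totalSent / $fileSize))
}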

folders downloading as files in ftp

When my PowerShell script starts up, it polls an FTP server and downloads any files that aren't in the local folder. The problem is that when it gets to a folder, it downloads it as a file. This is my code for checking for new files:
$LocFolder = 'C:\EMSDropBox\*'
Remove-Item $LocFolder
$ftprequest = [System.Net.FtpWebRequest]::Create("ftp://NZHQFTP1/tbarnes")
$ftprequest.Proxy = $null
$ftprequest.KeepAlive = $false
$ftprequest.TimeOut = 10000000
$ftprequest.UsePassive = $False
$ftprequest.Credentials = New-Object System.Net.NetworkCredential("tbarnes", "Static_flow2290")
$ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory
$FTPResponse = $ftprequest.GetResponse()
$ResponseStream = $FTPResponse.GetResponseStream()
$FTPReader = New-Object System.IO.Streamreader($ResponseStream)
$filename = $FTPReader.ReadLine()
while($filename -ne $null)
{
    try
    {
        if((Test-Path ("C:\emsdropbox\"+$filename)) -ne $true)
        {
            downloadFtp($filename)
        }
        $filename = $FTPReader.ReadLine()
    }
    catch
    {
        Write-Host $_
    }
}
$FTPReader.Close()
$FTPResponse.Close()
$ResponseStream.Close()
and this is the downloadFtp function:
# FTP Config
$FTPHost = "****"
$Username = "******"
$Password = "*********"
$FTPFile = $file
# FTP Log File Url
$FTPFileUrl = "ftp://" + $FTPHost + "/tbarnes/" + $FTPFile
# Create FTP Connection
$FTPRequest = [System.Net.FtpWebRequest]::Create("$FTPFileUrl")
$FTPRequest.Credentials = New-Object System.Net.NetworkCredential($Username, $Password)
$FTPRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
$FTPRequest.UsePassive = $false
$FTPRequest.UseBinary = $true
$FTPRequest.KeepAlive = $false
$targetfile = New-Object IO.FileStream (("C:\emsdropbox\"+$file),[IO.FileMode]::Create)
# Get FTP File
$FTPResponse = $FTPRequest.GetResponse()
$ResponseStream = $FTPResponse.GetResponseStream()
$FTPReader = New-Object -typename System.IO.StreamReader -ArgumentList $ResponseStream
[byte[]]$readbuffer = New-Object byte[] 1024
#loop through the download stream and send the data to the target file
do{
    $readlength = $ResponseStream.Read($readbuffer,0,1024)
    $targetfile.Write($readbuffer,0,$readlength)
}
while ($readlength -ne 0)
$targetfile.Close()
$FTPReader.Close()
I'm not sure why it won't pull them down as folders, so any help or pointers would be great!
The FTP methods don't support downloading of folders, or recursion, by themselves, so there's no other way I can think of doing this but what I've suggested below.
Change the method so you can differentiate between files and directories and handle them accordingly.
Change $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectory to $ftprequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
Which will list in this format:
-rwxrwxrwx 1 owner group 277632 Mar 4 17:15 xml_socket.log.rar
Directories will have a d in place of the - at the start of the line.
Here is an amended try block that will match only files so you can pass to the downloadFtp function:
try
{
    if(!($filename -match '^d'))
    {
        if((Test-Path ("C:\emsdropbox\" + ($filename -split '\s+')[8])) -ne $true)
        {
            downloadFtp(($filename -split '\s+')[8])
        }
    }
    $filename = $FTPReader.ReadLine()
}
If you then want to get the list of directories, use the following try block against the same FTP response stream, and for each directory issue another ListDirectoryDetails request to get the list of files in it and process them:
try
{
    if($filename -match '^d')
    {
        if((Test-Path ("C:\emsdropbox\" + ($filename -split '\s+')[8])) -ne $true)
        {
            listdir(($filename -split '\s+')[8])
        }
    }
    $filename = $FTPReader.ReadLine()
}
You may also have to create the local directories, which you can do in PowerShell as follows:
New-Item c:\emsdropbox\newdir -type directory
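Putting the pieces together, a recursive download could look roughly like this. This is a hypothetical helper (not tested against your server): it assumes a Unix-style listing like the sample above, that the name is the ninth whitespace-separated token, and placeholder credentials.
function Download-FtpDirectory($remoteUrl, $localPath, $credentials)
{
    if (-not (Test-Path $localPath)) {
        New-Item $localPath -type directory | Out-Null
    }
    # List the directory with details so files and subdirectories can be told apart
    $listRequest = [System.Net.FtpWebRequest]::Create($remoteUrl)
    $listRequest.Credentials = $credentials
    $listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
    $reader = New-Object System.IO.StreamReader($listRequest.GetResponse().GetResponseStream())
    $lines = @()
    while (($line = $reader.ReadLine()) -ne $null) { $lines += $line }
    $reader.Close()
    foreach ($line in $lines)
    {
        $name = ($line -split '\s+', 9)[8]   # ninth token: the file or directory name
        if ($line -match '^d') {
            # Directory: recurse into it
            Download-FtpDirectory "$remoteUrl/$name" (Join-Path $localPath $name) $credentials
        }
        else {
            # File: download it
            $download = [System.Net.FtpWebRequest]::Create("$remoteUrl/$name")
            $download.Credentials = $credentials
            $download.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
            $download.UseBinary = $true
            $stream = $download.GetResponse().GetResponseStream()
            $target = New-Object IO.FileStream((Join-Path $localPath $name), [IO.FileMode]::Create)
            $buffer = New-Object byte[] 1024
            do {
                $read = $stream.Read($buffer, 0, 1024)
                $target.Write($buffer, 0, $read)
            } while ($read -gt 0)
            $target.Close()
            $stream.Close()
        }
    }
}
# Usage (host from the question, credentials are placeholders):
$creds = New-Object System.Net.NetworkCredential("tbarnes", "your-password")
Download-FtpDirectory "ftp://NZHQFTP1/tbarnes" "C:\EMSDropBox" $creds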

ftp via powershell - how to indicate success

I'm using a PowerShell script to automate sending a .txt file to an FTP site. When I execute it in PowerShell, nothing happens: the prompt just reappears, with no message that it was successful. How do I tell if it worked? Here is my script in case it helps.
$localfile = "D:\Export\TESTING.txt"
$remotefile = "/TESTING.txt"
$ftphost = "ftp://ftp.site.com"
$URI = $ftphost + $remotefile
$username="USERNAME"
$password="1234"
function Get-FTPFile ($URI, $localfile, $username, $password)
{
    $credentials = New-Object System.Net.NetworkCredential($username, $password)
    $ftp = [System.Net.FtpWebRequest]::Create($URI)
    $ftp.Credentials = $credentials
    $ftp.UseBinary = 1
    $ftp.KeepAlive = 0
    $response = $ftp.GetResponse()
    $responseStream = $response.GetResponseStream()
    $file = New-Object IO.FileStream($localfile, [IO.FileMode]::Create)
    [byte[]]$buffer = New-Object byte[] 1024
    $read = 0
    do {
        $read = $responseStream.Read($buffer, 0, 1024)
        $file.Write($buffer, 0, $read)
    }
    while ($read -ne 0)
    $file.Close()
}
Use WebRequestMethods.Ftp.GetFileSize following completion of the upload to confirm that the uploaded file size matches the local file size.
You could also use a try/catch block to check for exceptions during your read/write operations. A lack of exceptions would give you some confidence that the upload was successful (i.e. no news is good news).
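For example, a sketch of the size check, reusing $URI, $username, $password and $localfile from the question:
# Ask the server for the remote file's size after the transfer completes
$sizeRequest = [System.Net.FtpWebRequest]::Create($URI)
$sizeRequest.Credentials = New-Object System.Net.NetworkCredential($username,$password)
$sizeRequest.Method = [System.Net.WebRequestMethods+Ftp]::GetFileSize
$sizeResponse = $sizeRequest.GetResponse()
$remoteSize = $sizeResponse.ContentLength
$sizeResponse.Close()
if ($remoteSize -eq (Get-Item $localfile).Length) {
    Write-Host "Transfer verified: $remoteSize bytes"
} else {
    Write-Host "Size mismatch - transfer may have failed"
}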