PowerShell Script to download an entire folder from FTP [duplicate]

I'd like to write a PowerShell script to download all files and subfolders from my FTP server. I found a script that downloads all files from one specific folder, but I'd also like to download the subfolders and their files.
#FTP Server Information - SET VARIABLES
$ftp = "ftp://ftp.abc.ch/"
$user = 'abc'
$pass = 'abc'
$folder = '/'
$target = "C:\LocalData\Powershell\"
#SET CREDENTIALS
$credentials = new-object System.Net.NetworkCredential($user, $pass)
function Get-FtpDir ($url, $credentials) {
    $request = [Net.WebRequest]::Create($url)
    $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
    if ($credentials) { $request.Credentials = $credentials }
    $response = $request.GetResponse()
    $reader = New-Object IO.StreamReader $response.GetResponseStream()
    $reader.ReadToEnd()
    $reader.Close()
    $response.Close()
}
#SET FOLDER PATH
$folderPath= $ftp + "/" + $folder + "/"
$Allfiles=Get-FTPDir -url $folderPath -credentials $credentials
$files = ($Allfiles -split "`r`n")
$files
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$counter = 0
foreach ($file in ($files | where {$_ -like "*.*"})){
$source=$folderPath + $file
$destination = $target + $file
$webclient.DownloadFile($source, $target+$file)
#PRINT FILE NAME AND COUNTER
$counter++
$counter
$source
}
Thanks for your help (:

Neither the .NET Framework nor PowerShell has any explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to identify files from subdirectories. There's no way to do that in a portable way with the .NET Framework (FtpWebRequest or WebClient). The .NET Framework unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for files and succeed for directories (or vice versa), i.e. you can try to download the "name". If that succeeds, it's a file; if it fails, it's a directory (see the sketch after this list).
You may be lucky and, in your specific case, be able to tell a file from a directory by its name (i.e. all your files have an extension, while subdirectories do not).
You can use a long directory listing (the LIST command = the ListDirectoryDetails method) and try to parse the server-specific listing. Many FTP servers use a *nix-style listing, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The main example below uses this approach (assuming the *nix format).
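First, a minimal sketch of the try-and-see option; the helper name Test-FtpDirectory is hypothetical, and a real implementation would want to distinguish error types rather than treat every failure as "directory":
# Hypothetical helper: try to download the entry; if the download fails,
# assume the entry is a directory (option 1 above)
function Test-FtpDirectory($fileUrl, $credentials)
{
    try
    {
        $request = [Net.WebRequest]::Create($fileUrl)
        $request.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
        $request.Credentials = $credentials
        $request.GetResponse().Dispose()
        return $False  # the download succeeded, so it is a file
    }
    catch
    {
        return $True   # the download failed, so it is most likely a directory
    }
}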
function DownloadFtpDirectory($url, $credentials, $localPath)
{
    $listRequest = [Net.WebRequest]::Create($url)
    $listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
    $listRequest.Credentials = $credentials

    $lines = New-Object System.Collections.ArrayList

    $listResponse = $listRequest.GetResponse()
    $listStream = $listResponse.GetResponseStream()
    $listReader = New-Object System.IO.StreamReader($listStream)
    while (!$listReader.EndOfStream)
    {
        $line = $listReader.ReadLine()
        $lines.Add($line) | Out-Null
    }
    $listReader.Dispose()
    $listStream.Dispose()
    $listResponse.Dispose()

    foreach ($line in $lines)
    {
        $tokens = $line.Split(" ", 9, [StringSplitOptions]::RemoveEmptyEntries)
        $name = $tokens[8]
        $permissions = $tokens[0]

        $localFilePath = Join-Path $localPath $name
        $fileUrl = ($url + $name)

        if ($permissions[0] -eq 'd')
        {
            if (!(Test-Path $localFilePath -PathType container))
            {
                Write-Host "Creating directory $localFilePath"
                New-Item $localFilePath -Type directory | Out-Null
            }

            DownloadFtpDirectory ($fileUrl + "/") $credentials $localFilePath
        }
        else
        {
            Write-Host "Downloading $fileUrl to $localFilePath"

            $downloadRequest = [Net.WebRequest]::Create($fileUrl)
            $downloadRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
            $downloadRequest.Credentials = $credentials

            $downloadResponse = $downloadRequest.GetResponse()
            $sourceStream = $downloadResponse.GetResponseStream()
            $targetStream = [System.IO.File]::Create($localFilePath)
            $buffer = New-Object byte[] 10240
            while (($read = $sourceStream.Read($buffer, 0, $buffer.Length)) -gt 0)
            {
                $targetStream.Write($buffer, 0, $read)
            }
            $targetStream.Dispose()
            $sourceStream.Dispose()
            $downloadResponse.Dispose()
        }
    }
}
Use the function like:
$credentials = New-Object System.Net.NetworkCredential("user", "mypassword")
$url = "ftp://ftp.example.com/directory/to/download/"
DownloadFtpDirectory $url $credentials "C:\target\directory"
The code is translated from my C# example in C# Download all files and subdirectories through FTP.
Though note that Microsoft does not recommend FtpWebRequest for new development.
If you want to avoid the trouble of parsing server-specific directory listing formats, use a 3rd-party library that supports the MLSD command and/or parses the various LIST listing formats, and that supports recursive downloads.
For example, with the WinSCP .NET assembly you can download a whole directory with a single call to Session.GetFiles:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "user"
    Password = "mypassword"
}

$session = New-Object WinSCP.Session

try
{
    # Connect
    $session.Open($sessionOptions)

    # Download files
    $session.GetFiles("/directory/to/download/*", "C:\target\directory\*").Check()
}
finally
{
    # Disconnect, clean up
    $session.Dispose()
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
The Session.GetFiles method is recursive by default.
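If you need to limit what gets downloaded while keeping the recursion, you can pass a TransferOptions with a file mask (a minimal sketch, assuming the session from the example above is still open):
# Download only *.txt files, still recursing into subdirectories
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.txt"
$session.GetFiles(
    "/directory/to/download/*", "C:\target\directory\*",
    $False, $transferOptions).Check()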
(I'm the author of WinSCP)

For retrieving files/folders from FTP via PowerShell, I wrote some functions; you can even get hidden items from the FTP server.
Example for getting all files and subfolders (even hidden ones) in a specific folder:
Get-FtpChildItem -ftpFolderPath "ftp://myHost.com/root/leaf/" -userName "User" -password "pw" -Directory -File
You can just copy the functions from the following module without needing any 3rd-party library installed:
https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1

Related

Using WinSCP in PowerShell to compare FTP to local directories

I'm trying to use WinSCP and PowerShell to list the remote and local directories, compare the lowest-level directory names, and then transfer the directories that exist on the remote side but are missing locally.
Listing works for both remote and local, but the comparison with the -contains operator shows false even though, when I Write-Host $fileInfo, it looks like they should match. I tried adding $fileInfo to an array for the comparison, but it contains much more than just the lowest directory, like the attributes and other properties of the directories.
$localpath = "C:\Users"
$remotePath = "/home"
Add-Type -Path "WinSCPnet.dll"
$localfolders = Get-ChildItem $localpath
$localtargets = @()
$remotetargets = @()

foreach ($f in $localfolders) {
    $split = $f.FullName -split '\\'
    $localtargets += $split[$split.Count - 1]
}

$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.com"
    UserName = "user"
    Password = "password"
}

$session = New-Object WinSCP.Session

try
{
    $session.Open($sessionOptions)
    $fileInfos = $session.ListDirectory($remotePath)
    foreach ($fileInfo in $fileInfos.Files)
    {
        Write-Host $localtargets.Contains($fileInfo)
        $remotetargets += $fileInfo | Out-String
    }
}
As I understand it, $localtargets is just a list of strings representing file names, so you should just test whether it contains the name of the file. In your last loop, can you test:
Write-Host $localtargets.Contains($fileInfo.Name)
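For example, a sketch of the last loop comparing by name only (RemoteFileInfo also exposes IsDirectory, IsThisDirectory and IsParentDirectory, which help skip plain files and the . and .. entries; treat this as illustrative):
foreach ($fileInfo in $fileInfos.Files)
{
    # Compare by name; $fileInfo itself stringifies to a full listing entry
    if ($fileInfo.IsDirectory -and !$fileInfo.IsThisDirectory -and !$fileInfo.IsParentDirectory)
    {
        Write-Host "$($fileInfo.Name) exists locally: $($localtargets.Contains($fileInfo.Name))"
        $remotetargets += $fileInfo.Name
    }
}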
The post by @JPBlack correctly answers your literal question.
Though note that the WinSCP .NET assembly can synchronize the directories on its own using Session.SynchronizeDirectories:
$session.Open($sessionOptions)
$session.SynchronizeDirectories(
    [WinSCP.SynchronizationMode]::Local, $localpath, $remotePath, $False).Check()
Or if you just want to find the differences, use Session.CompareDirectories.
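A minimal sketch of the comparison approach (the property names follow the WinSCP .NET assembly documentation; adjust to taste):
$differences = $session.CompareDirectories(
    [WinSCP.SynchronizationMode]::Local, $localpath, $remotePath, $False)
foreach ($difference in $differences)
{
    # Each difference describes what a synchronization would have to do;
    # depending on the direction, either .Local or .Remote is set
    if ($difference.Remote) { $name = $difference.Remote.FileName }
    else                    { $name = $difference.Local.FileName }
    Write-Host "$($difference.Action): $name"
}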

How to use PowerShell to download files from SharePoint?

I've used the following sites to help me get this far and to troubleshoot.
Download file from SharePoint
How to download newest file from SharePoint using PowerShell
Mike Smith's Tech Training Notes SharePoint, PowerShell and .Net!
Upload file to a SharePoint doc library via PowerShell
Download latest file from SharePoint Document Library
How to iterate each folders in each of the SharePoint websites using PowerShell
PowerShell's Get-ChildItem on SharePoint Library
I am trying to download random files from a SharePoint folder, and I have it working when I actually know the file name and extension.
Working code with name and extension:
$SharePoint = "https://Share.MyCompany.com/MyCustomer/WorkLoad.docx"
$Path = "$ScriptPath\$($CustomerName)_WorkLoad.docx"
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Download Files
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.Networkcredential($UserName, $Password)
$WebClient.DownloadFile($SharePoint, $Path)
However, I don't seem to be able to figure out how to do it with multiple files of unknown names or extensions.
I have tried mapping a drive, only to end up with "drive mapping failed" & "The network path was not found." errors:
$SharePoint = Read-Host 'Enter the full path to Delivery Site'
$LocalDrive = 'P:'
$Credentials = Get-Credential
if (!(Test-Path $LocalDrive -PathType Container)) {
    $retrycount = 0; $completed = $false
    while (-not $completed) {
        Try {
            if (!(Test-Path $LocalDrive -PathType Container)) {
                (New-Object -ComObject WScript.Network).MapNetworkDrive($LocalDrive, $SharePoint, $false, $($Credentials.username), $($Credentials.GetNetworkCredential().password))
            }
            $Completed = $true
        }
        Catch {
            if ($retrycount -ge '5') {
                Write-Verbose "Mapping SharePoint drive failed the maximum number of times"
                throw "SharePoint drive mapping failed for '$($SharePoint)': $($Global:Error[0].Exception.Message)"
            } else {
                Write-Verbose "Mapping SharePoint drive failed, retrying in 5 seconds."
                Start-Sleep '5'
                $retrycount++
            }
        }
    }
}
I've also used the following code with similar results or no results at all.
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Gathering the location of the Card Formats and Destination folder
$Customer = "$SharePoint\MyCustomer"
$Products = "$Path\$($CustomerName)\Products\"
#Get Documents from SharePoint
$credential = New-Object System.Management.Automation.PSCredential($UserName, $Password)
New-PSDrive -Credential $credential -Name "A" -PSProvider "FileSystem" -Root "$SharePoint"
net use $spPath $password /USER:$user@corporate
#Get PMDeliverables file objects recursively
Get-ChildItem -Path "$Customer" | Where-Object { $_.name -like 'MS*' } | Copy-Item -Destination $Products -Force -Verbose
Without defined "input parameters", it's not exactly clear the full solution you need so I'll provide a few snippets of PowerShell that should be of use based on what you've described.
I'll spare you the basics of the various OOTB functions (i.e. Get-SPWeb, etc) though can provide those details as well if needed. I've also been overly explicit in the scripting, though know some of these lines could be chained, piped, etc to be made shorter & more efficient.
This example will iterate over the contents of a SharePoint Library and download them to your local machine:
$Destination = "C:\YourDestinationFolder\ForFilesFromSP"
$Web = Get-SPWeb "https://YourServerRoot/sites/YourSiteCollection/YourSPWebURL"
$DocLib = $Web.Lists["Your Doc Library Name"]
$DocLibItems = $DocLib.Items

foreach ($DocLibItem in $DocLibItems) {
    if ($DocLibItem.Url -Like "*.docx") {
        $File = $Web.GetFile($DocLibItem.Url)
        $Binary = $File.OpenBinary()
        # Create the local file and write the document's bytes to it
        $Stream = New-Object System.IO.FileStream(($Destination + "\" + $File.Name), [System.IO.FileMode]::Create)
        $Writer = New-Object System.IO.BinaryWriter($Stream)
        $Writer.Write($Binary)
        $Writer.Close()
        $Stream.Close()
    }
}
This is pretty basic; the variables up top are where on your local machine you wish to store the downloaded files ($Destination), the URL of your SharePoint Site/Web ($Web), and the name of the Document Library ("Your Doc Library Name").
The script then iterates through the items in the Library (foreach ($DocLibItem in $DocLibItems) {}), optionally filters for, say, items with a .docx file extension, and downloads each to your local machine.
You could customize this further by targeting a specific sub-folder within the Doc Library, filtering by metadata or properties of the Docs, or even iterating over multiple Sites, Webs and/or Libraries in one script, optionally filtering those based on similar properties.
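As a hypothetical sketch of one such refinement, the loop above could skip items older than a cutoff by reading the standard Modified field before downloading:
# Hypothetical refinement: only download items modified in the last 7 days
$cutoff = (Get-Date).AddDays(-7)
foreach ($DocLibItem in $DocLibItems) {
    # "Modified" is a standard SharePoint list item field
    if ([DateTime]$DocLibItem["Modified"] -lt $cutoff) { continue }
    if ($DocLibItem.Url -Like "*.docx") {
        # ...download as in the example above...
    }
}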

Recursively downloading files from FTP server

Hi all, I am new to PowerShell scripting. I am trying to download all the files in the root directory, and the files in its subfolders, from an FTP server, matching on file names with -like. My path always points to the root directory; the root directory contains text files and subfolders, and the subfolders contain text files.
I found an example that downloads text files from an FTP server to the local machine, but it only downloads the text files from the root directory; it does not scan the subfolders or get the files in them.
Here is the example I tried:
# $url = "ftp://XXX.com/"
$user = 'UserName'
$pass = 'Password'
$folder = 'FTP_Folder'
$target = "C:\Folder\Folder1\"
#SET CREDENTIALS
$credentials = new-object System.Net.NetworkCredential($user, $pass)
function Get-FtpDir ($url, $credentials) {
    $request = [Net.WebRequest]::Create($url)
    $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
    if ($credentials) { $request.Credentials = $credentials }
    $response = $request.GetResponse()
    $reader = New-Object IO.StreamReader $response.GetResponseStream()
    $reader.ReadToEnd()
    $reader.Close()
    $response.Close()
}
#SET FOLDER PATH
$folderPath= $ftp + "/" + $folder + "/"
$Allfiles=Get-FTPDir -url $folderPath -credentials $credentials
$files = ($Allfiles -split "`r`n")
$files
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$counter = 0
foreach ($file in ($files | where {$_ -like "*.txt"})){
$source=$folderPath + $file
$destination = $target + $file
$webclient.DownloadFile($source, $target+$file)
#PRINT FILE NAME AND COUNTER
$counter++
$counter
$source
}
I tried the PSFTP module's functions, like Get-FTPChildItem with recursion, but it did not work for me.
Please help me get the files from the root folder and the subfolders of the FTP server to the local system.
I have done it in the past using WinSCP and SFTP; the example is below. It uses WinSCP's SCP protocol, which runs over SSH. I am sure you can change the WinSCP protocol to FTP and try different combinations; hope this helps you. Change the file mask to only include *.txt and this will go through subfolders. Also, for this to work you will need WinSCPnet.dll and WinSCP.exe, which you can easily download from the WinSCP site: get the ".NET Assembly/COM Library" package, keep the script and the required files in the same folder, and update the path in the script. Good luck.
# Load WinSCP .NET assembly
Add-Type -Path "C:\Users\administrator.AD\Documents\WinSCP Script\WinSCPnet.dll"

$hostname = "ftp.xyz.net"
$user = "me"
$pass = "mypassword"
$sfolder = "/home/me"
$dfolder = "C:\tmp"
$sshHostKeyFingerprint = "ssh-rsa 2048 eb:2c:f9:ab:19:2e:0c:22:02:d4:8f:64:44:75:ec:04"
$mask = "*.jpg;*.jpeg;*.png;*.tiff;*.txt"

####################### END USER CONFIGURABLE SETTINGS #########################

# Main script

# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Scp
    HostName = $hostname
    UserName = $user
    Password = $pass
    SshHostKeyFingerprint = $sshHostKeyFingerprint
}

$session = New-Object WinSCP.Session

# Limit the synchronization to the configured file mask
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = $mask

try
{
    # Will continuously report progress of synchronization
    # (the FileTransferred function must be defined first; see the sketch below)
    $session.add_FileTransferred( { FileTransferred($_) } )

    # Connect
    $session.Open($sessionOptions)

    # Synchronize files
    $synchronizationResult = $session.SynchronizeDirectories(
        [WinSCP.SynchronizationMode]::Local, $dfolder, $sfolder, $False, $False, 1, $transferOptions)

    # Throw on any error
    $synchronizationResult.Check()
}
finally
{
    # Disconnect, clean up
    $session.Dispose()
}
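The script wires up a FileTransferred handler for progress reporting but does not define it. Here is a minimal sketch of such a function, modeled on the WinSCP documentation examples (define it before $session.add_FileTransferred runs; the output format is just illustrative):
function FileTransferred
{
    param($e)

    # $e is a WinSCP.TransferEventArgs; Error is $Null on success
    if ($e.Error -eq $Null)
    {
        Write-Host "Transfer of $($e.FileName) succeeded"
    }
    else
    {
        Write-Host "Transfer of $($e.FileName) failed: $($e.Error)"
    }
}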

FTPS Upload in Powershell

I'm in the process of learning PowerShell, and am working on a little script that will upload a group of files to an FTPS server nightly. The files are located on a network share in a sub-directory containing the date in the name. The files themselves all begin with the same string, let's say "JONES_". I have this script working for FTP, but I don't quite get what I need to do to get it to work for FTPS:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
#Create Log File
$Logfile = "C:\powershell\$YDate.log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -value $logstring
}

# Find Directory w/ Yesterday's Date in name
$YesterdayFolder = Get-ChildItem -Path "\\network\storage\location" | Where-Object {$_.FullName.contains($YDate)}

If ($YesterdayFolder) {
    #we specify the directory where all files that we want to upload are contained
    $Dir = $YesterdayFolder

    #ftp server
    $ftp = "ftp://ftps.site.com"
    $user = "USERNAME"
    $pass = "PASSWORD"

    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

    $FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES", "CurrentCultureIgnoreCase")}

    foreach ($item in ($FilesToUpload))
    {
        LogWrite "Uploading file: $YesterdayFolder\Report\$item"
        $uri = New-Object System.Uri($ftp + $item.Name)
        $webclient.UploadFile($uri, $item.FullName)
    }
} Else {
    LogWrite "No files to upload"
}
I'd rather not have to deal with a 3rd party software solution, if at all possible.
Using psftp didn't work for me. I couldn't get it to connect to the FTP over SSL. I ended up (reluctantly?) using WinSCP with this code:
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USER:PASS@ftps.hostname.com:21/directory/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
In the foreach loop.
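If the quoting in the Invoke-Expression string gets hard to maintain, the same call can be made with the & call operator instead (a sketch assuming the same paths and credentials; winscp.com uses doubled double-quotes to escape quotes inside a /command argument):
$winscp = "C:\Program Files (x86)\WinSCP\winscp.com"
& $winscp /command `
    "open ftp://USER:PASS@ftps.hostname.com:21/directory/ -explicitssl" `
    "put ""$($Item.FullName)""" `
    "exit"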
I'm not sure if you would consider this "3rd party software" or not, but you can run PSFTP from within PowerShell. Here is an example of how you could do that (source):
$outfile = "$YesterdayFolder\Report\$($item.Name)"
"rm $outfile`nput $outfile`nbye" | Out-File batch.psftp -Force -Encoding ASCII
$user = "USERNAME"
$pass = "PASSWORD"
& .\psftp.exe -l $user -pw $pass $ftp -b batch.psftp -be
