How to use PowerShell to download files from SharePoint?

I've used the following sites to help me get this far and to troubleshoot.
Download file from SharePoint
How to download newest file from SharePoint using PowerShell
Mike Smith's Tech Training Notes SharePoint, PowerShell and .Net!
Upload file to a SharePoint doc library via PowerShell
Download latest file from SharePoint Document Library
How to iterate each folders in each of the SharePoint websites using PowerShell
PowerShell's Get-ChildItem on SharePoint Library
I am trying to download random files from a SharePoint folder, and I have it working when I actually know the file name and extension.
Working code with name and extension:
$SharePoint = "https://Share.MyCompany.com/MyCustomer/WorkLoad.docx"
$Path = "$ScriptPath\$($CustomerName)_WorkLoad.docx"
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Download Files
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.Networkcredential($UserName, $Password)
$WebClient.DownloadFile($SharePoint, $Path)
However, I don't seem to be able to figure out how to do it with multiple files of unknown names or extensions.
I have tried mapping a drive, only to end up with "drive mapping failed" & "The network path was not found." errors:
$SharePoint = Read-Host 'Enter the full path to Delivery Site'
$LocalDrive = 'P:'
$Credentials = Get-Credential
if (!(Test-Path $LocalDrive -PathType Container)) {
$retrycount = 0; $completed = $false
while (-not $completed) {
Try {
if (!(Test-Path $LocalDrive -PathType Container)) {
(New-Object -ComObject WScript.Network).MapNetworkDrive($LocalDrive,$SharePoint,$false,$($Credentials.username),$($Credentials.GetNetworkCredential().password))
}
$Completed = $true
}
Catch {
if ($retrycount -ge 5) {
Write-Verbose "Mapping SharePoint drive failed the maximum number of times"
throw "SharePoint drive mapping failed for '$($SharePoint)': $($Global:Error[0].Exception.Message)"
} else {
Write-Verbose "Mapping SharePoint drive failed, retrying in 5 seconds."
Start-Sleep -Seconds 5
$retrycount++
}
}
}
}
I've also used the following code with similar results or no results at all.
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Gathering the location of the Card Formats and Destination folder
$Customer = "$SharePoint\MyCustomer"
$Products = "$Path\$($CustomerName)\Products\"
#Get Documents from SharePoint
$credential = New-Object System.Management.Automation.PSCredential($UserName, $Password)
New-PSDrive -Credential $credential -Name "A" -PSProvider "FileSystem" -Root "$SharePoint"
net use $spPath $password /USER:$user@corporate
#Get PMDeliverables file objects recursively
Get-ChildItem -Path "$Customer" | Where-Object { $_.name -like 'MS*' } | Copy-Item -Destination $Products -Force -Verbose

Without defined "input parameters", it's not exactly clear the full solution you need so I'll provide a few snippets of PowerShell that should be of use based on what you've described.
I'll spare you the basics of the various OOTB functions (i.e. Get-SPWeb, etc) though can provide those details as well if needed. I've also been overly explicit in the scripting, though know some of these lines could be chained, piped, etc to be made shorter & more efficient.
This example will iterate over the contents of a SharePoint Library and download them to your local machine:
$Destination = "C:\YourDestinationFolder\ForFilesFromSP"
$Web = Get-SPWeb "https://YourServerRoot/sites/YourSiteCollection/YourSPWebURL"
$DocLib = $Web.Lists["Your Doc Library Name"]
$DocLibItems = $DocLib.Items
foreach ($DocLibItem in $DocLibItems) {
if($DocLibItem.Url -Like "*.docx") {
$File = $Web.GetFile($DocLibItem.Url)
$Binary = $File.OpenBinary()
$Stream = New-Object System.IO.FileStream(($Destination + "\" + $File.Name), [System.IO.FileMode]::Create)
$Writer = New-Object System.IO.BinaryWriter($Stream)
$Writer.write($Binary)
$Writer.Close()
}
}
This is pretty basic; the variables up top are where on your local machine you wish to store the downloaded files ($Destination), the URL of your SharePoint Site/Web ($Web), and the name of the Document Library ("Your Doc Library Name").
The script then iterates through the items in the Library (foreach ($DocLibItem in $DocLibItems) {}), optionally filters for items with a .docx file extension, and downloads each to your local machine.
You could customize this further by targeting a specific sub-folder within the Doc Library (as sketched below), filtering by metadata or properties of the Docs, or even iterating over multiple Sites, Webs and/or Libraries in one script, optionally filtering those based on similar properties.
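For example, here is a minimal sketch of the sub-folder variant, reusing the same objects as above; the "Reports" folder name and the .docx filter are placeholders:
# Hypothetical example: only download .docx items under a "Reports" sub-folder
$Destination = "C:\YourDestinationFolder\ForFilesFromSP"
$Web = Get-SPWeb "https://YourServerRoot/sites/YourSiteCollection/YourSPWebURL"
$DocLib = $Web.Lists["Your Doc Library Name"]
foreach ($DocLibItem in $DocLib.Items) {
    # The item URL carries the folder path, so filter on it ("Reports" is a placeholder)
    if ($DocLibItem.Url -like "*/Reports/*" -and $DocLibItem.Url -like "*.docx") {
        $File = $Web.GetFile($DocLibItem.Url)
        # OpenBinary() returns a byte[], which WriteAllBytes can persist directly
        [System.IO.File]::WriteAllBytes((Join-Path $Destination $File.Name), $File.OpenBinary())
    }
}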

Related

How to download two different files in powershell

I have this simple script that works: I'm able to download a CSV file from my SharePoint site. I'm just wondering how I can modify it so that it downloads another file, located at a different path, and saves it in the same local folder as soon as the first one finishes downloading.
Basically this is what I'm trying to do
Download this first
$SourceFile="/sites/msteams/ReportsFull/OneDrive.csv"
Download this second, save in the same folder with the first one
$SourceFile="/sites/msteams/Shared%20Documents/OneDriveInventory/ActiveLitHoldWithOneDrive.csv"
#Set parameter values
$SiteURL="https://companyName.sharepoint.com"
$SourceFile="/sites/msteams/ReportsFull/OneDrive.csv"
$TargetFile="C:\\Users\\AS\\Downloads\\New folder\\LegalDoc.docx"
Function Download-FileFromLibrary()
{
param
(
[Parameter(Mandatory=$true)] [string] $SiteURL,
[Parameter(Mandatory=$true)] [string] $SourceFile,
[Parameter(Mandatory=$true)] [string] $TargetFile
)
Try {
#Setup Credentials to connect
$Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Global:adminUPN, $Global:adminPwd)
#Setup the context
$Ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
$Ctx.Credentials = $Credentials
#Download the file from the library
$FileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($Ctx,$SourceFile)
$WriteStream = [System.IO.File]::Open($TargetFile,[System.IO.FileMode]::Create)
$FileInfo.Stream.CopyTo($WriteStream)
$WriteStream.Close()
Write-Host -f Green "File '$SourceFile' Downloaded to '$TargetFile' Successfully!"
}
Catch {
Write-Host -f Red "Error Downloading File!" $_.Exception.Message
}
}
#Call the function to download file
Download-FileFromLibrary -SiteURL $SiteURL -SourceFile $SourceFile -TargetFile $TargetFile
You've got the right code; you just need to call the function twice:
$sourceFile1 = '/sites/msteams/ReportsFull/OneDrive.csv'
$targetFile1 = 'C:\Users\AS\Downloads\New folder\LegalDoc.docx'
Download-FileFromLibrary -SiteURL $SiteURL -SourceFile $sourceFile1 -TargetFile $targetFile1
$sourceFile2 = '/sites/msteams/Shared%20Documents/OneDriveInventory/ActiveLitHoldWithOneDrive.csv'
$targetFile2 = 'C:\Users\AS\Downloads\New folder\NewFile2.docx'
Download-FileFromLibrary -SiteURL $SiteURL -SourceFile $sourceFile2 -TargetFile $targetFile2
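If more files follow, you can also drive the same function from a small list of source/target pairs; a minimal sketch reusing the paths from the question (the local file names are placeholders):
$downloads = @(
    @{ Source = '/sites/msteams/ReportsFull/OneDrive.csv';
       Target = 'C:\Users\AS\Downloads\New folder\OneDrive.csv' },
    @{ Source = '/sites/msteams/Shared%20Documents/OneDriveInventory/ActiveLitHoldWithOneDrive.csv';
       Target = 'C:\Users\AS\Downloads\New folder\ActiveLitHoldWithOneDrive.csv' }
)
foreach ($d in $downloads) {
    # Each call returns only after its download completes, so the files arrive in order
    Download-FileFromLibrary -SiteURL $SiteURL -SourceFile $d.Source -TargetFile $d.Target
}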

Recursively Upload Large Files Via PNP-Powershell to Sharepoint Online Using Large Chunks

I have some files that I need to keep in sync with my SharePoint Document Library. The problem is that some of the files are over the 250 MB limit; the code I currently have works, but only for files under that limit. I cannot figure out how to upload the same recursive set of files to SharePoint using large chunks. It seems that what I need to integrate is option 3 of this page -> LARGE FILE HANDLING - OPTION 3 (StartUpload, ContinueUpload and FinishUpload). It also seems that if (Test-Path($topSPOFolder+"\"+$aFileName.FullName)) is not checking the target directory for an existing file so it can skip and move on. Any help is always appreciated.
$password = ConvertTo-SecureString "test123" -AsPlainText -Force
$username = "abc#def.com"
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)
$makeUrl ="https://helloworld.sharepoint.com/sites/helloworld"
$sourcePath = "\\1.2.3.4\e$\MSI\packages\*\*.msi";
$topSPOFolder = "Shared Documents\packages\test";
# install pnp powershell..?
#Install-Module SharePointPnPPowerShellOnline
# connect to spo online
Connect-PnPOnline -Url $makeUrl -Credentials $cred
$fileNames = Get-ChildItem -Path $sourcePath -Recurse ;
foreach($aFileName in $fileNames)
{
if($aFileName.GetType().Name -ne "DirectoryInfo")
{
if (Test-Path($topSPOFolder+"\"+$aFileName.FullName))
{
Write-Host 'Skipping file, already downloaded' -ForegroundColor Yellow
return
} else {
$fn=$topSPOFolder;
Add-PnPFile -Path $aFileName.FullName -Folder $fn;
$fn=$null
}
}
}
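On the Test-Path point: Test-Path only checks the local file system, so joining the SPO folder to a local FullName can never match anything. Below is a minimal sketch of a server-side existence check instead, assuming the same PnP cmdlets as above (Get-PnPFile wants a URL with forward slashes; prefix it with /sites/helloworld/ if your version needs a server-relative path). Note this only fixes the skip logic; for files over the 250 MB limit you would still need the StartUpload/ContinueUpload/FinishUpload flow from the article linked in the question.
$spoFolderUrl = "Shared Documents/packages/test"   # forward slashes for the URL form
foreach ($aFile in (Get-ChildItem -Path $sourcePath -Recurse -File))
{
    # Ask SharePoint whether the file already exists instead of Test-Path on a local path
    $existing = Get-PnPFile -Url "$spoFolderUrl/$($aFile.Name)" -ErrorAction SilentlyContinue
    if ($existing)
    {
        Write-Host "Skipping $($aFile.Name), already uploaded" -ForegroundColor Yellow
        continue   # 'return' would end the whole loop; 'continue' moves to the next file
    }
    Add-PnPFile -Path $aFile.FullName -Folder $spoFolderUrl
}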

PowerShell Script to download an entire folder to FTP [duplicate]

I'd like to write a PowerShell script to download all files and subfolders from my FTP server. I found a script to download all files from one specific folder, but I'd also like to download the subfolders and their files.
#FTP Server Information - SET VARIABLES
$ftp = "ftp://ftp.abc.ch/"
$user = 'abc'
$pass = 'abc'
$folder = '/'
$target = "C:\LocalData\Powershell\"
#SET CREDENTIALS
$credentials = new-object System.Net.NetworkCredential($user, $pass)
function Get-FtpDir ($url,$credentials) {
$request = [Net.WebRequest]::Create($url)
$request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
if ($credentials) { $request.Credentials = $credentials }
$response = $request.GetResponse()
$reader = New-Object IO.StreamReader $response.GetResponseStream()
$reader.ReadToEnd()
$reader.Close()
$response.Close()
}
#SET FOLDER PATH
$folderPath= $ftp + "/" + $folder + "/"
$Allfiles=Get-FTPDir -url $folderPath -credentials $credentials
$files = ($Allfiles -split "`r`n")
$files
$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user,$pass)
$counter = 0
foreach ($file in ($files | where {$_ -like "*.*"})){
$source=$folderPath + $file
$destination = $target + $file
$webclient.DownloadFile($source, $target+$file)
#PRINT FILE NAME AND COUNTER
$counter++
$counter
$source
}
Thanks for your help (:
Neither the .NET Framework nor PowerShell has any explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to distinguish files from subdirectories. There's no way to do that portably with the .NET Framework (FtpWebRequest or WebClient). The .NET Framework unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for files and succeed for directories (or vice versa). I.e., you can try to download the "name": if that succeeds, it's a file; if that fails, it's a directory.
You may be lucky and in your specific case, you can tell a file from a directory by a file name (i.e. all your files have an extension, while subdirectories do not)
You use a long directory listing (the LIST command = ListDirectoryDetails method) and try to parse the server-specific listing. Many FTP servers use a *nix-style listing, where you identify a directory by the d at the very beginning of the entry; but many servers use a different format. The following example uses this approach (assuming the *nix format):
function DownloadFtpDirectory($url, $credentials, $localPath)
{
$listRequest = [Net.WebRequest]::Create($url)
$listRequest.Method =
[System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
$listRequest.Credentials = $credentials
$lines = New-Object System.Collections.ArrayList
$listResponse = $listRequest.GetResponse()
$listStream = $listResponse.GetResponseStream()
$listReader = New-Object System.IO.StreamReader($listStream)
while (!$listReader.EndOfStream)
{
$line = $listReader.ReadLine()
$lines.Add($line) | Out-Null
}
$listReader.Dispose()
$listStream.Dispose()
$listResponse.Dispose()
foreach ($line in $lines)
{
$tokens = $line.Split(" ", 9, [StringSplitOptions]::RemoveEmptyEntries)
$name = $tokens[8]
$permissions = $tokens[0]
$localFilePath = Join-Path $localPath $name
$fileUrl = ($url + $name)
if ($permissions[0] -eq 'd')
{
if (!(Test-Path $localFilePath -PathType container))
{
Write-Host "Creating directory $localFilePath"
New-Item $localFilePath -Type directory | Out-Null
}
DownloadFtpDirectory ($fileUrl + "/") $credentials $localFilePath
}
else
{
Write-Host "Downloading $fileUrl to $localFilePath"
$downloadRequest = [Net.WebRequest]::Create($fileUrl)
$downloadRequest.Method =
[System.Net.WebRequestMethods+Ftp]::DownloadFile
$downloadRequest.Credentials = $credentials
$downloadResponse = $downloadRequest.GetResponse()
$sourceStream = $downloadResponse.GetResponseStream()
$targetStream = [System.IO.File]::Create($localFilePath)
$buffer = New-Object byte[] 10240
while (($read = $sourceStream.Read($buffer, 0, $buffer.Length)) -gt 0)
{
$targetStream.Write($buffer, 0, $read);
}
$targetStream.Dispose()
$sourceStream.Dispose()
$downloadResponse.Dispose()
}
}
}
Use the function like:
$credentials = New-Object System.Net.NetworkCredential("user", "mypassword")
$url = "ftp://ftp.example.com/directory/to/download/"
DownloadFtpDirectory $url $credentials "C:\target\directory"
The code is translated from my C# example in C# Download all files and subdirectories through FTP.
Note that Microsoft does not recommend FtpWebRequest for new development.
If you want to avoid the trouble of parsing server-specific directory listing formats, use a third-party library that supports the MLSD command and/or parses the various LIST listing formats, and that supports recursive downloads.
For example with WinSCP .NET assembly you can download whole directory with a single call to Session.GetFiles:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::Ftp
HostName = "ftp.example.com"
UserName = "user"
Password = "mypassword"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Download files
$session.GetFiles("/directory/to/download/*", "C:\target\directory\*").Check()
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
The Session.GetFiles method is recursive by default.
(I'm the author of WinSCP)
For retrieving files/folders from FTP via PowerShell, I wrote some functions; you can even get hidden items from the FTP server.
Example for getting all files and subfolders (even hidden ones) in a specific folder:
Get-FtpChildItem -ftpFolderPath "ftp://myHost.com/root/leaf/" -userName "User" -password "pw" -Directory -File
You can just copy the functions from the following module without needing any third-party library installed:
https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1

Download entire solution or specific branch from TFS to local folder

Hi all, I've got the below script to download a file from TFS using PowerShell, but I need to download the entire solution. How can I achieve that?
cls
$tfsCollectionUrl = New-Object System.URI("http://localhost:8080/tfs/defaultcollection");
[Microsoft.TeamFoundation.Client.TfsTeamProjectCollection] $tfsCollection = Get-TfsServer $tfsCollectionUrl
$VersionControl = $tfsCollection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])
$DestinationFile = [IO.Path]::GetTempFileName()
$VersionControl.DownloadFileByUrl('$/MyFirstProject/WebApplication1/WebApplication1/WebForm1.aspx.cs', $DestinationFile)
Invoke-Item $DestinationFile
Also, this does not check whether the user has permission to download; I would like to prompt for a username and password instead of downloading directly. Can I achieve the same for Bitbucket too, and if so, how?
Here is the same code converted to PowerShell: it connects to TFS and downloads the files present in it (VS2010). For credentials, use the same logic as in the answer below.
Write-Host "Enter source location "
$sourceLocation = Read-Host
$tfsCollectionUrl = New-Object System.URI($sourceLocation);
Write-Host "Enter server path "
$serverPath = Read-Host
Write-Host "Enter local path to download"
$localPath = Read-Host
[Microsoft.TeamFoundation.Client.TfsTeamProjectCollection] $tfsCollection = Get-TfsServer $tfsCollectionUrl
$VersionControl = $tfsCollection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])
$latest = [Microsoft.TeamFoundation.VersionControl.Client.VersionSpec]::Latest
$recursionType = [Microsoft.TeamFoundation.VersionControl.Client.RecursionType]::Full
try
{
foreach ($item in $VersionControl.GetItems($serverPath, $latest,$recursionType).Items)
{
$target = [io.path]::Combine($localPath,$item.ServerItem.Substring(2))
$exists=[System.IO.Directory]::Exists($target)
if($item.ItemType -eq "Folder" -and !$exists)
{
New-Item $target -Type Directory
}
if($item.ItemType -eq "File")
{
$item.DownloadFile($target)
}
}
Write-Host "`n Successfully downloaded all the files to the target folder: " $localPath -ForegroundColor Green
}
catch
{
$ErrorMessage = $_.Exception.Message
$FailedItem = $_.Exception.ItemName
Break
}
An easy way is to have a workspace mapping the projects and run the tf get command from PowerShell.
You can also manage workspaces with the tf workspace command line, as sketched below.
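For illustration, a hedged sketch of that tf.exe flow (the workspace and path names are placeholders; tf.exe must be on the PATH, e.g. from a Developer Command Prompt):
# Create a workspace against the collection
tf workspace /new MyWorkspace /collection:http://localhost:8080/tfs/defaultcollection
# Map a server folder to a local folder
tf workfold /map '$/MyFirstProject' 'C:\src\MyFirstProject' /workspace:MyWorkspace
# Fetch everything under the mapped folder
cd C:\src\MyFirstProject
tf get /recursive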
The simple way is:
Create a workspace
Map a workspace
Get all files.
Sample code:
$tfsCollectionUrl = New-Object System.URI("[team project collection url]");
$username="[user name]"
$password="[password]"
$domain="[domain]"
$cret = new-object System.Net.NetworkCredential($username, $password, $domain)
$teamProjectCollection=new-object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($tfsCollectionUrl,$cret)
$teamProjectCollection.EnsureAuthenticated()
$VersionControl = $teamProjectCollection.GetService([Microsoft.TeamFoundation.VersionControl.Client.VersionControlServer])
$workspace = $VersionControl.CreateWorkspace("BasicSccExamplePS", $VersionControl.AuthorizedUser);
$workspace.Map("[file or folder server path, for example:$/TestTeam/FolderA]", "[local path]")
$workspace.Get()
As you can see, it lets you provide the credential, so you could prompt the user for a username and password and then connect to TFS with that account.
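For example, a minimal sketch of swapping the hard-coded values for a prompt (variable names are illustrative):
$promptedCred = Get-Credential   # interactive username/password prompt
# TfsTeamProjectCollection accepts any ICredentials, so the prompted credential can be passed straight in
$teamProjectCollection = New-Object Microsoft.TeamFoundation.Client.TfsTeamProjectCollection($tfsCollectionUrl, $promptedCred.GetNetworkCredential())
$teamProjectCollection.EnsureAuthenticated()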
Regards
