Recursively upload large files via PnP PowerShell to SharePoint Online using large chunks - powershell

I have some files that I need to keep in sync with my SharePoint document library. The problem is that some of the files are over the 250 MB limit: the code I currently have works, but only for files under that limit. I cannot figure out how to upload the same set of files recursively to SharePoint using this code together with large chunks. It seems that what I need to integrate is option 3 of this page -> LARGE FILE HANDLING - OPTION 3 (StartUpload, ContinueUpload and FinishUpload). It also seems that if (Test-Path($topSPOFolder+"\"+$aFileName.FullName)) is not checking the target directory for an existing file so it can skip it and move on. Any help is always appreciated.
$password = ConvertTo-SecureString "test123" -AsPlainText -Force
$username = "abc@def.com"
$cred = New-Object System.Management.Automation.PSCredential ($username, $password)
$makeUrl = "https://helloworld.sharepoint.com/sites/helloworld"
$sourcePath = "\\1.2.3.4\e$\MSI\packages\*\*.msi";
$topSPOFolder = "Shared Documents\packages\test";

# install pnp powershell..?
#Install-Module SharePointPnPPowerShellOnline

# connect to spo online
Connect-PnPOnline -Url $makeUrl -Credentials $cred

$fileNames = Get-ChildItem -Path $sourcePath -Recurse;
foreach ($aFileName in $fileNames)
{
    if ($aFileName.GetType().Name -ne "DirectoryInfo")
    {
        if (Test-Path($topSPOFolder + "\" + $aFileName.FullName))
        {
            Write-Host 'Skipping file, already downloaded' -ForegroundColor Yellow
            return
        }
        else
        {
            $fn = $topSPOFolder;
            Add-PnPFile -Path $aFileName.FullName -Folder $fn;
            $fn = $null
        }
    }
}
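The Test-Path check above looks at the local file system (and combines the SharePoint folder name with the file's full local path), so it never finds anything; to skip files that already exist in the library you have to ask SharePoint itself. Below is a minimal sketch, not the original code: the site-relative URL handling and Get-PnPFile returning nothing for a missing file under -ErrorAction SilentlyContinue are assumptions to verify. It also replaces return (which would stop the whole loop at the first existing file) with continue. It does not implement the StartUpload/ContinueUpload/FinishUpload chunking from the linked article; some newer PnP PowerShell builds chunk large uploads inside Add-PnPFile, but check that against your module version.
# Minimal sketch (assumptions: site-relative target URLs, Get-PnPFile returns nothing for a missing file)
$fileNames = Get-ChildItem -Path $sourcePath -Recurse -File
foreach ($aFileName in $fileNames)
{
    # Build the target URL from the library folder and the file name only
    $targetUrl = ($topSPOFolder -replace '\\', '/') + "/" + $aFileName.Name

    $existing = Get-PnPFile -Url $targetUrl -ErrorAction SilentlyContinue
    if ($existing)
    {
        Write-Host "Skipping $($aFileName.Name), already uploaded" -ForegroundColor Yellow
        continue   # 'continue' moves to the next file; 'return' would end the loop entirely
    }

    Add-PnPFile -Path $aFileName.FullName -Folder $topSPOFolder
}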

Related

How to download two different files in powershell

I have this simple script that is working, and I'm able to download a CSV file from my SharePoint site, but I'm wondering how I can modify it so that it downloads another file located in another path and saves it in the same local folder as soon as the first one finishes downloading.
Basically this is what I'm trying to do
Download this first
$SourceFile="/sites/msteams/ReportsFull/OneDrive.csv"
Download this second, save in the same folder with the first one
$SourceFile="/sites/msteams/Shared%20Documents/OneDriveInventory/ActiveLitHoldWithOneDrive.csv"
#Set parameter values
$SiteURL="https://companyName.sharepoint.com"
$SourceFile="/sites/msteams/ReportsFull/OneDrive.csv"
$TargetFile="C:\Users\AS\Downloads\New folder\LegalDoc.docx"
Function Download-FileFromLibrary()
{
    param
    (
        [Parameter(Mandatory=$true)] [string] $SiteURL,
        [Parameter(Mandatory=$true)] [string] $SourceFile,
        [Parameter(Mandatory=$true)] [string] $TargetFile
    )

    Try {
        #Setup Credentials to connect
        $Credentials = New-Object Microsoft.SharePoint.Client.SharePointOnlineCredentials($Global:adminUPN, $Global:adminPwd)

        #Setup the context
        $Ctx = New-Object Microsoft.SharePoint.Client.ClientContext($SiteURL)
        $Ctx.Credentials = $Credentials

        #sharepoint online powershell download file from library
        $FileInfo = [Microsoft.SharePoint.Client.File]::OpenBinaryDirect($Ctx, $SourceFile)
        $WriteStream = [System.IO.File]::Open($TargetFile, [System.IO.FileMode]::Create)
        $FileInfo.Stream.CopyTo($WriteStream)
        $WriteStream.Close()

        Write-Host -f Green "File '$SourceFile' Downloaded to '$TargetFile' Successfully!"
    }
    Catch {
        Write-Host -f Red "Error Downloading File!" $_.Exception.Message
    }
}
#Call the function to download file
Download-FileFromLibrary -SiteURL $SiteURL -SourceFile $SourceFile -TargetFile $TargetFile
You've got the right code; you just need to call the function twice:
$sourceFile1 = '/sites/msteams/ReportsFull/OneDrive.csv'
$targetFile1 = 'C:\Users\AS\Downloads\New folder\LegalDoc.docx'
Download-FileFromLibrary -SiteURL $SiteURL -SourceFile $sourceFile1 -TargetFile $targetFile1

$sourceFile2 = '/sites/msteams/Shared%20Documents/OneDriveInventory/ActiveLitHoldWithOneDrive.csv'
$targetFile2 = 'C:\Users\AS\Downloads\New Folder\NewFile2.docx'
Download-FileFromLibrary -SiteURL $SiteURL -SourceFile $sourceFile2 -TargetFile $targetFile2

PowerShell Script to download an entire folder to FTP [duplicate]

I'd like to write a PowerShell script to download all files and subfolders from my FTP server. I found a script that downloads all files from one specific folder, but I'd also like to download the subfolders and their files.
#FTP Server Information - SET VARIABLES
$ftp = "ftp://ftp.abc.ch/"
$user = 'abc'
$pass = 'abc'
$folder = '/'
$target = "C:\LocalData\Powershell\"

#SET CREDENTIALS
$credentials = New-Object System.Net.NetworkCredential($user, $pass)

function Get-FtpDir ($url, $credentials) {
    $request = [Net.WebRequest]::Create($url)
    $request.Method = [System.Net.WebRequestMethods+FTP]::ListDirectory
    if ($credentials) { $request.Credentials = $credentials }
    $response = $request.GetResponse()
    $reader = New-Object IO.StreamReader $response.GetResponseStream()
    $reader.ReadToEnd()
    $reader.Close()
    $response.Close()
}

#SET FOLDER PATH
$folderPath = $ftp + "/" + $folder + "/"

$Allfiles = Get-FtpDir -url $folderPath -credentials $credentials
$files = ($Allfiles -split "`r`n")
$files

$webclient = New-Object System.Net.WebClient
$webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

$counter = 0
foreach ($file in ($files | where {$_ -like "*.*"})) {
    $source = $folderPath + $file
    $destination = $target + $file
    $webclient.DownloadFile($source, $target + $file)

    #PRINT FILE NAME AND COUNTER
    $counter++
    $counter
    $source
}
Thanks for your help (:
Neither the .NET Framework nor PowerShell has explicit support for recursive file operations (including downloads). You have to implement the recursion yourself:
List the remote directory
Iterate the entries, downloading files and recursing into subdirectories (listing them again, etc.)
The tricky part is to distinguish files from subdirectories. There's no way to do that in a portable way with the .NET Framework (FtpWebRequest or WebClient). The .NET Framework unfortunately does not support the MLSD command, which is the only portable way to retrieve a directory listing with file attributes in the FTP protocol. See also Checking if object on FTP server is file or directory.
Your options are:
Do an operation on a file name that is certain to fail for files and succeed for directories (or vice versa). I.e. you can try to download the "name"; if that succeeds, it's a file, if it fails, it's a directory.
You may be lucky and, in your specific case, be able to tell a file from a directory by its name (i.e. all your files have an extension, while subdirectories do not).
Use a long directory listing (LIST command = ListDirectoryDetails method) and try to parse a server-specific listing. Many FTP servers use *nix-style listings, where you identify a directory by the d at the very beginning of the entry. But many servers use a different format. The following example uses this approach (assuming the *nix format):
function DownloadFtpDirectory($url, $credentials, $localPath)
{
    $listRequest = [Net.WebRequest]::Create($url)
    $listRequest.Method = [System.Net.WebRequestMethods+Ftp]::ListDirectoryDetails
    $listRequest.Credentials = $credentials

    $lines = New-Object System.Collections.ArrayList

    $listResponse = $listRequest.GetResponse()
    $listStream = $listResponse.GetResponseStream()
    $listReader = New-Object System.IO.StreamReader($listStream)
    while (!$listReader.EndOfStream)
    {
        $line = $listReader.ReadLine()
        $lines.Add($line) | Out-Null
    }
    $listReader.Dispose()
    $listStream.Dispose()
    $listResponse.Dispose()

    foreach ($line in $lines)
    {
        $tokens = $line.Split(" ", 9, [StringSplitOptions]::RemoveEmptyEntries)
        $name = $tokens[8]
        $permissions = $tokens[0]

        $localFilePath = Join-Path $localPath $name
        $fileUrl = ($url + $name)

        if ($permissions[0] -eq 'd')
        {
            if (!(Test-Path $localFilePath -PathType container))
            {
                Write-Host "Creating directory $localFilePath"
                New-Item $localFilePath -Type directory | Out-Null
            }

            DownloadFtpDirectory ($fileUrl + "/") $credentials $localFilePath
        }
        else
        {
            Write-Host "Downloading $fileUrl to $localFilePath"

            $downloadRequest = [Net.WebRequest]::Create($fileUrl)
            $downloadRequest.Method = [System.Net.WebRequestMethods+Ftp]::DownloadFile
            $downloadRequest.Credentials = $credentials

            $downloadResponse = $downloadRequest.GetResponse()
            $sourceStream = $downloadResponse.GetResponseStream()
            $targetStream = [System.IO.File]::Create($localFilePath)
            $buffer = New-Object byte[] 10240
            while (($read = $sourceStream.Read($buffer, 0, $buffer.Length)) -gt 0)
            {
                $targetStream.Write($buffer, 0, $read)
            }
            $targetStream.Dispose()
            $sourceStream.Dispose()
            $downloadResponse.Dispose()
        }
    }
}
Use the function like:
$credentials = New-Object System.Net.NetworkCredential("user", "mypassword")
$url = "ftp://ftp.example.com/directory/to/download/"
DownloadFtpDirectory $url $credentials "C:\target\directory"
The code is translated from my C# example in C# Download all files and subdirectories through FTP.
Though note that Microsoft does not recommend FtpWebRequest for new development.
If you want to avoid trouble with parsing server-specific directory listing formats, use a 3rd-party library that supports the MLSD command and/or parsing the various LIST listing formats, and recursive downloads.
For example with WinSCP .NET assembly you can download whole directory with a single call to Session.GetFiles:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = [WinSCP.Protocol]::Ftp
    HostName = "ftp.example.com"
    UserName = "user"
    Password = "mypassword"
}

$session = New-Object WinSCP.Session

try
{
    # Connect
    $session.Open($sessionOptions)

    # Download files
    $session.GetFiles("/directory/to/download/*", "C:\target\directory\*").Check()
}
finally
{
    # Disconnect, clean up
    $session.Dispose()
}
Internally, WinSCP uses the MLSD command, if supported by the server. If not, it uses the LIST command and supports dozens of different listing formats.
The Session.GetFiles method is recursive by default.
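If you only need a subset of the remote tree, the same call can take a transfer options object with a file mask; a small sketch (the *.csv mask is just an illustration):
# Sketch: restrict the recursive download to *.csv files (illustrative mask)
$transferOptions = New-Object WinSCP.TransferOptions
$transferOptions.FileMask = "*.csv"
$session.GetFiles("/directory/to/download/*", "C:\target\directory\*", $False, $transferOptions).Check()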
(I'm the author of WinSCP)
For retrieving files/folders from FTP via PowerShell I wrote some functions; you can even get hidden items from the FTP server.
Example for getting all files and subfolders (even hidden ones) in a specific folder:
Get-FtpChildItem -ftpFolderPath "ftp://myHost.com/root/leaf/" -userName "User" -password "pw" -Directory -File
You can just copy the functions from the following module without needing any 3rd-party library installed:
https://github.com/AstralisSomnium/PowerShell-No-Library-Just-Functions/blob/master/FTPModule.ps1

How to use PowerShell to download files from SharePoint?

I've used the following sites to help me get this far and to troubleshoot.
Download file from SharePoint
How to download newest file from SharePoint using PowerShell
Mike Smith's Tech Training Notes SharePoint, PowerShell and .Net!
Upload file to a SharePoint doc library via PowerShell
Download latest file from SharePoint Document Library
How to iterate each folders in each of the SharePoint websites using PowerShell
PowerShell's Get-ChildItem on SharePoint Library
I am trying to download random files from a SharePoint folder, and I have it working for when I actually know the file name and extension.
Working code with name and extension:
$SharePoint = "https://Share.MyCompany.com/MyCustomer/WorkLoad.docx"
$Path = "$ScriptPath\$($CustomerName)_WorkLoad.docx"
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Download Files
$WebClient = New-Object System.Net.WebClient
$WebClient.Credentials = New-Object System.Net.Networkcredential($UserName, $Password)
$WebClient.DownloadFile($SharePoint, $Path)
However, I don't seem to be able to figure out how to do it with multiple files of unknown names or extensions.
I have tried mapping a drive, only to end up with "drive mapping failed" & "The network path was not found." errors:
$SharePoint = Read-Host 'Enter the full path to Delivery Site'
$LocalDrive = 'P:'
$Credentials = Get-Credential

if (!(Test-Path $LocalDrive -PathType Container)) {
    $retrycount = 0; $completed = $false
    while (-not $completed) {
        Try {
            if (!(Test-Path $LocalDrive -PathType Container)) {
                (New-Object -ComObject WScript.Network).MapNetworkDrive($LocalDrive, $SharePoint, $false, $($Credentials.username), $($Credentials.GetNetworkCredential().password))
            }
            $Completed = $true
        }
        Catch {
            if ($retrycount -ge '5') {
                Write-Verbose "Mapping SharePoint drive failed the maximum number of times"
                throw "SharePoint drive mapping failed for '$($SharePoint)': $($Global:Error[0].Exception.Message)"
            } else {
                Write-Verbose "Mapping SharePoint drive failed, retrying in 5 seconds."
                Start-Sleep '5'
                $retrycount++
            }
        }
    }
}
I've also used the following code with similar results or no results at all.
#Get User Information
$user = Read-Host "Enter your username"
$username = "$user#MyCompany"
$password = Read-Host "Enter your password" -AsSecureString
#Gathering the location of the Card Formats and Destination folder
$Customer = "$SharePoint\MyCustomer"
$Products = "$Path\$($CustomerName)\Products\"
#Get Documents from SharePoint
$credential = New-Object System.Management.Automation.PSCredential($UserName, $Password)
New-PSDrive -Credential $credential -Name "A" -PSProvider "FileSystem" -Root "$SharePoint"
net use $spPath @$password /USER:$user@corporate
#Get PMDeliverables file objects recursively
Get-ChildItem -Path "$Customer" | Where-Object { $_.name -like 'MS*' } | Copy-Item -Destination $Products -Force -Verbose
Without defined "input parameters", it's not exactly clear what full solution you need, so I'll provide a few snippets of PowerShell that should be of use based on what you've described.
I'll spare you the basics of the various OOTB functions (i.e. Get-SPWeb, etc.), though I can provide those details as well if needed. I've also been overly explicit in the scripting; know that some of these lines could be chained, piped, etc. to be made shorter and more efficient.
This example will iterate over the contents of a SharePoint Library and download them to your local machine:
$Destination = "C:\YourDestinationFolder\ForFilesFromSP"
$Web = Get-SPWeb "https://YourServerRoot/sites/YourSiteCollection/YourSPWebURL"
$DocLib = $Web.Lists["Your Doc Library Name"]
$DocLibItems = $DocLib.Items
foreach ($DocLibItem in $DocLibItems) {
if($DocLibItem.Url -Like "*.docx") {
$File = $Web.GetFile($DocLibItem.Url)
$Binary = $File.OpenBinary()
$Stream = New-Object System.IO.FileStream($Destination + "\" + $File.Name), Create
$Writer = New-Object System.IO.BinaryWriter($Stream)
$Writer.write($Binary)
$Writer.Close()
}
}
This is pretty basic; the variables up top are where on your local machine you wish to store the downloaded files ($Destination), the URL of your SharePoint Site/Web ($Web) and the name of the Document Library (Your Doc Library Name).
The script then iterates through the items in the Library (foreach ($DocLibItem in $DocLibItems) {}), optionally filters for say items with a .docx file extension and downloads each to your local machine.
You could customize this further by targeting a specific sub-folder within the Doc Library, filter by metadata or properties of the Docs or even iterate over multiple Sites, Webs and/or Libraries in one script, optionally filtering those based on similar properties.
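For example, to target just one sub-folder of the library, a quick (illustrative) filter on the item URL works; the "Reports" folder name below is a hypothetical placeholder:
# Sketch: restrict the loop to items under a (hypothetical) "Reports" sub-folder
$DocLibItems = $DocLib.Items | Where-Object { $_.Url -Like "*/Reports/*" }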

FTPS Upload in Powershell

I'm in the process of learning Powershell, and am working on a little script that will upload a group of files to an FTPS server nightly. The files are located on a network share in a sub-directory containing the date in the name. The files themselves will all begin with the same string, let's say "JONES_". I have this script working for FTP, but I don't quite get what I need to do to get it to work for FTPS:
# Set yesterday's date (since uploads will happen at 2am)
$YDate = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')

#Create Log File
$Logfile = "C:\powershell\$YDate.log"
Function LogWrite
{
    Param ([string]$logstring)
    Add-Content $Logfile -value $logstring
}

# Find Directory w/ Yesterday's Date in name
$YesterdayFolder = Get-ChildItem -Path "\\network\storage\location" | Where-Object {$_.FullName.contains($YDate)}

If ($YesterdayFolder) {
    #we specify the directory where all files that we want to upload are contained
    $Dir = $YesterdayFolder

    #ftp server
    $ftp = "ftp://ftps.site.com"
    $user = "USERNAME"
    $pass = "PASSWORD"

    $webclient = New-Object System.Net.WebClient
    $webclient.Credentials = New-Object System.Net.NetworkCredential($user, $pass)

    $FilesToUpload = Get-ChildItem -Path (Join-Path $YesterdayFolder.FullName "Report") | Where-Object {$_.Name.StartsWith("JONES", "CurrentCultureIgnoreCase")}

    foreach ($item in ($FilesToUpload))
    {
        LogWrite "Uploading file: $YesterdayFolder\Report\$item"
        $uri = New-Object System.Uri($ftp + $item.Name)
        $webclient.UploadFile($uri, $item.FullName)
    }
} Else {
    LogWrite "No files to upload"
}
I'd rather not have to deal with a 3rd party software solution, if at all possible.
Using psftp didn't work for me. I couldn't get it to connect to the FTP over SSL. I ended up (reluctantly?) using WinSCP with this code:
$PutCommand = '& "C:\Program Files (x86)\WinSCP\winscp.com" /command "open ftp://USER:PASS#ftps.hostname.com:21/directory/ -explicitssl" "put """"' + $Item.FullName + '""""" "exit"'
Invoke-Expression $PutCommand
In the foreach loop.
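If you already have WinSCP installed, its .NET assembly (shown earlier in this thread for FTP downloads) can do the same upload over explicit TLS without building a quoted command string; a rough sketch, with host, credentials and remote directory taken as placeholders from the command above:
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"

# FTPS (explicit TLS/SSL) session options - placeholder credentials from the example above
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol  = [WinSCP.Protocol]::Ftp
    FtpSecure = [WinSCP.FtpSecure]::Explicit
    HostName  = "ftps.hostname.com"
    UserName  = "USER"
    Password  = "PASS"
}

$session = New-Object WinSCP.Session
try
{
    $session.Open($sessionOptions)
    # Upload the current file from the foreach loop; $Item.FullName comes from the question's script
    $session.PutFiles($Item.FullName, "/directory/").Check()
}
finally
{
    $session.Dispose()
}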
I'm not sure if you would consider this as "3rd party software" or not, but you can run PSFTP from within Powershell. Here is an example of how you could do that (source):
$outfile = "$YesterdayFolder\Report\$($item.Name)"
"rm $outfile`nput $outfile`nbye" | Out-File batch.psftp -Force -Encoding ASCII
$user = "USERNAME"
$pass = "PASSWORD"
& .\psftp.exe -l $user -pw $pass $ftp -b batch.psftp -be

Random string showing up in exported CSV

I have the PowerShell script built and I'm getting a "random" bit of output in the CSV file. The string is MailboxExport (and a number). It looks like a value that (Get-MailboxExportRequest).name would return, but I can't see where I would pull something like that or how it is being inserted. I think I may have just been staring at it too long and need a fresh pair of eyes to spot my mistake. I won't go into what the script is trying to do, since I've put quite a few notes in the script that should explain it fairly well.
################################################## PST Extraction Script ##################################################
# Completed October 2013 by Trey Nuckolls
#
# This script is meant to extract PST files from the Site 1 Exchange server at the Site2 site and deliver those PST
# files to a share on the Site2 network. The script will change the input CSV file to keep track of which PST files have been
# extracted and when that occurred. The script will also set security on the PST file so only the user and IT administration
# can access the PST file.
#
# To run this script, enter the username of the Site 1 domain account that you want to target for extraction of a PST file, then
# run the script. It can be run from any machine on the network as long as it is run by someone with domain admin rights on the
# Site 2 network. PowerShell v2 or v3 is required to run the script.
#
#############################################################################################################################
$InPstPath = '\\Site1_Server\PST_Store'
$OutPstPath = '\\Site2_Server\PST_Store'
$AdminPath = '\\Site2_Server\PST_Store\Admin\'
#Container for Site1 username
$User = Get-Content $AdminPath'login.txt'
#Container for encrypted Site1 Password
$PWord = Cat $AdminPath'pass.txt' | ConvertTo-SecureString
#Credential package for accessing Site1 resouces
$Credentials = New-Object -TypeName System.Management.Automation.PSCredential -ArgumentList $User, $PWord
#Creation of Powershell Drives for use during session
New-PSDrive -Name Site1Share -PSProvider FileSystem -Root $InPstPath -Credential $Credentials
New-PSDrive -Name Site2Share -PSProvider FileSystem -Root $OutPstPath
#Container for Powershell session to Exchange server
$PSSession = New-PSSession -ConfigurationName Microsoft.Exchange -ConnectionUri http://Site1_Server/powershell -Credential $Credentials
#Creation of Powershell session to Site1 Exchange server, including import of exchange commandlets
Import-PSSession $PSSession
#Import of the CSV file that lists users to be targeted
$In_List = Invoke-Command {Import-Csv "\\Site1_Server\PST_Store\To_Be_Exported.csv"} -computername Site1_Server -Credential $Credentials
$Processed = foreach ($objUser in $In_List) {
    if ($objUser.Completed -ne "Yes") {
        $TargetUser = $objUser.name
        $ShortDate = (Get-Date).toshortdatestring()
        $SourceFile = "Site1Share:\$TargetUser.pst"
        $DestinationFile = "Site2Share:\$TargetUser.pst"

        #Export Mailbox to PST File
        New-MailboxExportRequest -Mailbox $TargetUser -Filepath $InPstPath\$TargetUser.pst
        do {Start-Sleep -Seconds 10}
        until ((Get-MailboxExportRequest -Status InProgress).count -eq 0)

        #Copy PST File to PST Share
        Copy-Item -Path $SourceFile -Destination $DestinationFile

        #Add Security access on PST file (Target_User-Modify). Domain Admin-Full is inherited from parent.
        $Acl = Get-Acl $DestinationFile
        $Permission = "Site2_Domain\$TargetUser","Modify","Allow"
        $AccessRule = New-Object System.Security.AccessControl.FileSystemAccessRule $Permission
        $Acl.SetAccessRule($AccessRule)
        $Acl | Set-Acl $DestinationFile

        #Remove PST file From Temporary area
        Remove-Item -Path $SourceFile -Force

        #Write back to checklist for new items that have just been processed
        [PSCustomObject]@{Name=$TargetUser;Completed="Yes";Date=$ShortDate}
    } else { if ($objUser.Completed -eq "Yes") {
        #Passthrough of items that have already been completed
        [PSCustomObject]@{Name=$objUser.name;Completed=$objUser.Completed;Date=$objUser.Date}}
    }}
#Output the new version of the checklist
$Processed | export-csv -Path C:\TEMP\processed.csv
#Overwrite the old version checklist with the new one
Move-Item -Path C:\TEMP\processed.csv -Destination Site1Share:\To_Be_Exported.csv -force
#Cleanup PsDrives and PsSessions
Remove-PSDrive -Name Site1Share
Remove-PSDrive -Name Site2Share
Remove-PSSession -Session (Get-PSSession)
Input CSV is...
"Name","Completed","Date"
"User1","Yes","10/8/2013"
"User2","Yes","10/11/2013"
"User3",,
and output is...
"Name","Completed","Date"
"User1","Yes","10/8/2013"
"User2","Yes","10/11/2013"
"MailboxExport7",,
"User3","Yes","10/11/2013"
It is indeed very likely that the issue is caused by New-MailboxExportRequest, as you already suspected. The cmdlet prints information about the created object, which gets lumped together with the rest of the output you create in the loop and then assigned to the variable $Processed.
To avoid this you can suppress the cmdlet output like this:
New-MailboxExportRequest -Mailbox ... | Out-Null
or like this:
New-MailboxExportRequest -Mailbox ... >$null
Assigning the output to a variable should work as well:
$exportRequest = New-MailboxExportRequest -Mailbox ...
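For the script in the question, the export line would then look like this:
#Export Mailbox to PST File, discarding the request object that would otherwise end up in $Processed
New-MailboxExportRequest -Mailbox $TargetUser -Filepath $InPstPath\$TargetUser.pst | Out-Null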
On your Export-Csv, try adding the flag "-NoTypeInformation".
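Applied to the script's export step, that looks like this; -NoTypeInformation only drops the "#TYPE ..." header line from the CSV:
#Output the new version of the checklist without the #TYPE header line
$Processed | Export-Csv -Path C:\TEMP\processed.csv -NoTypeInformation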
I think this may be some sort of namespace crossover issue between the custom object and another existing object (probably the MailboxExportRequest object on the Exchange server). After messing around with this for a while I was able to get it to fail in a new way, where the resulting CSV file was full of details from the mailbox exports and there was a 'name' column that also listed the usernames. I changed the headers in the input CSV from 'name' to 'username' and the resultant MailboxExport entries have ceased. There are now blank rows, but I'm certainly willing to live with that imperfection as it doesn't break this (short-lived) process.
If anyone has any insight into the root cause I'd certainly love to hear what it is, but I think I've figured out a solution I can live with.