Copy-PnPFile returns "File Not Found" - PowerShell

I am writing a PowerShell script that copies all files from one document library to a different one in SharePoint Online.
I can't get my head around the behavior.
When I copy the whole folder, all the files get copied and it works fine:
Copy-PnPFile -SourceUrl "Old Library/" -TargetUrl "NewLibrary/"
This does not help me though, because I need to log each file and I can't do it this way. So I need to copy each file individually.
When I copy the files individually, I get an error message "File Not Found":
Copy-PnPFile -SourceUrl "Old Library/item.docx" -TargetUrl "NewLibrary/item.docx"
I have already tried using different paths:
relative to site
relative to root
full domain
Does anybody have an idea what the problem might be?
How can I copy the files and log individual files (name, path, success)?

The Copy-PnPFile cmdlet isn’t working correctly and an issue has been raised in the PnP PowerShell GitHub repo.
This has been ongoing for about a month now, but a fix should be introduced shortly.
https://github.com/SharePoint/PnP-PowerShell/issues/2103

I've faced the same issue and found only one working way to copy files via Copy-PnPFile across site collections within the same tenant: copying the whole root folder:
Copy-PnPFile -SourceUrl Documents -TargetUrl /sites/otherproject/Documents -SkipSourceFolderName
However, it will also try to copy allitems.aspx, so the full solution is this:
# Use a dedicated error variable; $error is an automatic variable and should not be reused.
$copyError = $null
Copy-PnPFile -SourceUrl Documents -TargetUrl /sites/otherproject/Documents -SkipSourceFolderName -ErrorAction SilentlyContinue -ErrorVariable copyError
# -ErrorVariable fills a collection, so check the first record;
# ignore the expected failure on the library's allitems.aspx view page.
if ($copyError -and -not $copyError[0].Exception.Message.ToLower().Contains("allitems.aspx")) {
    throw $copyError[0]
}
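If you do need per-file logging (name, path, success) as the question asks, a minimal sketch along these lines may help. It assumes the library names "Old Library" and "NewLibrary" from the question, uses parameter names from the legacy SharePointPnPPowerShellOnline module, and will of course still be affected by the per-file bug mentioned above until that is fixed:
# Enumerate the files in the source library and copy them one by one, logging each result.
$log = foreach ($item in Get-PnPListItem -List "Old Library" -PageSize 500) {
    if ($item.FileSystemObjectType -ne "File") { continue }   # skip folders

    $sourceUrl = $item.FieldValues["FileRef"]      # server-relative path of the file
    $fileName  = $item.FieldValues["FileLeafRef"]  # file name only
    try {
        Copy-PnPFile -SourceUrl $sourceUrl -TargetUrl "NewLibrary/$fileName" -OverwriteIfAlreadyExists -Force -ErrorAction Stop
        $copied = $true
    }
    catch {
        $copied = $false
    }
    [PSCustomObject]@{ Name = $fileName; Path = $sourceUrl; Success = $copied }
}
$log | Export-Csv -Path .\copy-log.csv -NoTypeInformation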

Related

Unable to use Copy-PnPFile to copy site pages across SharePoint Online sites

I tried all of the examples below for copying my site pages across different SharePoint sites, but I always keep getting this error. Can someone give the correct syntax, please?
Copy-PnPFile : Cannot contact site at the specified URL
Copy-PnPFile -SourceUrl SitePages/Home.aspx -TargetUrl "/sites/destination/SitePages"
Copy-PnPFile -SourceUrl SitePages -TargetUrl "/sites/destination/SitePages"
Copy-PnPFile -SourceUrl /sites/sourcesite/SitePages/Home.aspx -TargetUrl "/sites/destination/SitePages"
Is the TargetUrl site valid? "/sites/destination" should exist in your tenant.
I tested the PowerShell command and it pops up a confirmation dialogue if the target URL is correct.
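In case it helps, here is a hedged sketch of the cross-site pattern the examples above are aiming at: connect to the source site and pass server-relative URLs for both sides. The tenant and site names are placeholders, and the -OverwriteIfAlreadyExists switch is from the legacy module:
# Placeholder URLs; replace the tenant and site names with your own.
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/sourcesite" -UseWebLogin
Copy-PnPFile -SourceUrl "/sites/sourcesite/SitePages/Home.aspx" -TargetUrl "/sites/destination/SitePages" -OverwriteIfAlreadyExists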

PowerShell: Get-ChildItem throws "path too long" error even though my paths are short

I am facing a strange error in PowerShell. I am currently trying to copy files from a server (say Z:) to a folder on my laptop.
I want to perform some checks on each file, which is why I do not want to use robocopy.
When I run the command
Get-ChildItem -Path Y:\ -Recurse | Select-Object FullName, @{ Name = 'PathLength'; Expression = { $_.FullName.Length } }
I get an error (in French) telling me that the path is too long for some items (260 characters). But I checked, and the item paths are not that long. For example:
Y:\00.P1.2020.211-MEP 2020-11-01-RENOM\06 - Chantier
My Y:\ is a SharePoint Online library mapped as a network drive.
Do you know where it could come from?
Thanks in advance.
Well, it comes from the fact that a smart guy had put a file with a 308-character name in the folder. So I wasn't even able to see it in Explorer, but it was still there (in SharePoint) and raised the error.
I'll leave this here in case it can help someone else.
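For anyone hitting the same thing, here is a rough sketch of one way to surface the offending items. The drive letter comes from the question, and the 250-character threshold is just an arbitrary margin below the 260 limit:
# Collect enumeration errors instead of stopping on them.
$items = Get-ChildItem -Path Y:\ -Recurse -ErrorAction SilentlyContinue -ErrorVariable pathErrors

# Paths that were readable but are close to or over the limit.
$items | Where-Object { $_.FullName.Length -gt 250 } |
    Select-Object FullName, @{ Name = 'PathLength'; Expression = { $_.FullName.Length } }

# Items that could not be enumerated at all (often the over-long ones);
# TargetObject may hold the offending path, depending on the provider.
$pathErrors | ForEach-Object { $_.TargetObject }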

Moving Document Library to a subsite in SharePoint

Currently, we have document libraries created in SharePoint Online and would like to move them, using PowerShell, to their own subsite. The reason we would like to move them is that we would like to keep the version history. Since we are dealing with thousands of files, I would like to use PowerShell to complete this task.
I am currently connecting to my SharePoint site using:
Connect-PnPOnline -Url "Sitename" -UseWebLogin
Here is where I need assistance. I am trying to use Move-PnPFolder but I am not sure how to write a command that would define the source, destination, and move of all files in the document library to a subsite that I have manually created.
Here you need to mix PnP and CSOM in the PowerShell script. Get the SharePoint list the normal way, read its folders, and then inside a foreach loop call the command below: Move-PnPFolder -Folder 'Shared Documents/Reports/2016/Templates' -TargetFolder 'Shared Documents/Reports'. Something like this:
foreach ($oneFolder in $list.Folders) {
    Move-PnPFolder -Folder 'Shared Documents/Reports/2016/Templates' -TargetFolder 'Shared Documents/Reports'
}
Note:
In the command above, the $oneFolder variable gives you the source folder URL, and you already know the target folder location.
This is just sample code to present the concept; a fuller sketch follows below.
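Building on that concept, here is a hedged sketch that enumerates the top-level folders of a source library and moves each one. The site URL, library name, and subsite path are placeholders, and it assumes (as the answer above does) that Move-PnPFolder accepts the subsite's library as the target:
Connect-PnPOnline -Url "https://tenant.sharepoint.com/sites/parentsite" -UseWebLogin

# Enumerate the top-level folders of the source library (placeholder name "Shared Documents").
$folders = Get-PnPFolderItem -FolderSiteRelativeUrl "Shared Documents" -ItemType Folder

foreach ($oneFolder in $folders) {
    # Move each folder (and the files in it, keeping version history) into the subsite's library.
    # "Subsite/Shared Documents" is a placeholder site-relative target path.
    Move-PnPFolder -Folder "Shared Documents/$($oneFolder.Name)" -TargetFolder "Subsite/Shared Documents"
    Write-Host "Moved folder: $($oneFolder.Name)"
}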

Checking if a UNC Path/Server Folder Exists

I am working on a project that utilizes a PowerShell script that creates a new login on a remote SQL Server (SSMS) and then checks to see if a particular folder exists on the server. If the folder does not already exist, the script will create that folder.
The issue I am having is that I cannot verify whether or not the folder exists since the path I am testing is a UNC path of the form "\\server\Files\Log". I have tried many different solutions that I have found through a couple hours of searching online, and all solutions return FALSE even though I am testing a server and folder I know already exist.
I am using PowerGUI to write my script and my system is using PowerShell v5. What I have tried so far:
Test-Path $path (where $path has been set to \\server)
Test-Path "filesystem::\\Srv"
[System.IO.Directory]::Exists($path)
I even tried [System.IO.Directory]::Exists('G:\') using all of the letters I have network servers mapped to, to see if I needed to map to the drives to make it work (all returned FALSE)
What am I missing here? Any thoughts on this topic would be greatly appreciated as I have been grinding on this for a while with no progress being made.
EDIT: For anyone who might stumble upon this later, please read the comments, which I found to be super helpful. My main issue was that I was running PowerShell as an administrator, which does not have the same permissions as my normal user account. Also note that Test-Path \\server alone does not work; a folder must also be referenced.
You already have the correct answer:
Test-Path $path
or
Test-Path \\server.domain.tld\ShareName
If Test-Path is returning false, I can think of three things that could be wrong:
The share does not exist on that server, or at least with the name you expect
Your user does not have permission to read that share
You are specifying the short name of the server, and you need the FQDN to resolve it. This is common in multidomain environments.
After re-reading your question, it looks like you might be running Test-Path \\server. You cannot check for the existence of a server this way; you have to specify both the server and the share name at a minimum. If you want to know that a server exists and is online, use Test-Connection (assuming you are able to ping this server in the first place). Here is an example of using Test-Connection:
$serverName = 'server.domain.tld'
$sharePath  = 'ShareName' # you can append more paths here
if (Test-Connection $serverName 2> $null) {
    Test-Path "\\${serverName}\${sharePath}"
}
I used to have an issue where the file existed but Test-Path returned false. I put Test-Path in a loop that checks a maximum of 10 times, with a one-second pause in between. The script works fine now. I am not sure why it returns false on the first attempt but finds the file on the second or third.
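A minimal sketch of that retry pattern, with the path and retry count as placeholders:
# Retry Test-Path up to 10 times with a one-second pause between attempts.
$path  = '\\server.domain.tld\ShareName\somefile.txt'   # placeholder path
$found = $false
for ($i = 0; $i -lt 10 -and -not $found; $i++) {
    if (Test-Path $path) {
        $found = $true
    }
    else {
        Start-Sleep -Seconds 1
    }
}
$found   # $true if the path became visible within 10 attempts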

Error in PowerShell due to copying the content of an S3 bucket

I copy the content of an S3 bucket to a local directory; however, I get an error in PowerShell:
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the files that need to be copied do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single (-Key) or multi-object download (-KeyPrefix) modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
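For reference, a short sketch of both Read-S3Object modes; the bucket name, key prefix, region, and local paths are placeholders:
# Multi-object download: everything under the key prefix is written to the local folder,
# preserving the prefix structure as subfolders.
Read-S3Object -BucketName 'my-bucket' -KeyPrefix 'reports/2020/' -Folder 'C:\Downloads\reports' -Region 'us-east-1'

# Single-object download.
Read-S3Object -BucketName 'my-bucket' -Key 'reports/2020/summary.csv' -File 'C:\Downloads\summary.csv' -Region 'us-east-1'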
This seems to occur with folders that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it:
catch [Amazon.S3.AmazonS3Exception]
{
    # get the error record
    [Management.Automation.ErrorRecord]$e = $_
    # retrieve information about the runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }
    if ($info.ErrorCode -eq "InvalidRange") {
        # do nothing; this is the expected case for folders with no files
    }
    else {
        # output information; post-process and log the collected info (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file which had more than one dot in it. Simplifying the file name fixed the error.
File name that gave me error: myfile-18.10.exe
File name that worked: myfile-1810.exe