SharePoint Online copy file script - PowerShell

I am having trouble copying a file within a SharePoint Online list using PowerShell. The error I am getting is:
Exception calling "ExecuteQuery" with "0" argument(s): "Server relative urls must start with SPWeb.ServerRelativeUrl"
The path is correct, as I can combine $Context.Url with the path variables and access the file at that path. I used similar paths, except with GetFolderByServerRelativeUrl, to set permissions on folders with no issues (same list).
Here is the code.
$Context = New-Object Microsoft.SharePoint.Client.ClientContext($SiteUrl)
$SourceFile = $Context.Web.GetFileByServerRelativeUrl("/$ListName/$sa_man_checklist")
$Context.Load($SourceFile)
$Context.ExecuteQuery()
I am very new to SharePoint Online, and any help is much appreciated.

Found the cause; not sure how anything before this worked. The server-relative URL should have started after the host name, rather than with the URL I specified in the context. The odd thing is that when I call a folder by server-relative path using the URL specified in the context, it still works just fine, but when I try to call a file the same way, it breaks...
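For anyone hitting the same thing, here is a minimal sketch of the corrected call; the https://tenant.sharepoint.com/sites/MySite site URL is a placeholder, not from the original post, and the point is only that the server-relative URL starts after the host name:
$Context = New-Object Microsoft.SharePoint.Client.ClientContext("https://tenant.sharepoint.com/sites/MySite")
# The server-relative URL includes everything after the host name (managed path + site), not just /$ListName
$SourceFile = $Context.Web.GetFileByServerRelativeUrl("/sites/MySite/$ListName/$sa_man_checklist")
$Context.Load($SourceFile)
$Context.ExecuteQuery()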


PowerShell: Get-ChildItem throws "path too long" error even though my paths are short

I am facing a strange error in PowerShell. I am currently trying to copy files from a server (say Y:) to a folder on my laptop.
I want to perform some checks on each file, which is why I do not want to use robocopy.
When i run the command
Get-ChildItem -Path Y:\ -Recurse | Select-Object FullName, @{ Name = 'PathLength'; Expression = { $_.FullName.Length } }
I get an error (in French) telling me that the path is too long for some items (260 characters). But I checked, and the item paths are not that long. For example:
Y:\00.P1.2020.211-MEP 2020-11-01-RENOM\06 - Chantier
My Y:\ is a SharePoint Online library mapped as a network drive.
Do you know where this could come from?
Thanks in advance
Well, it comes from the fact that a smart guy had put a file with a 308-character name in the folder. I wasn't even able to see it in File Explorer, but it was still there (in SharePoint) to raise the error.
I'll leave this here in case it helps someone else.
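Not part of the original answer, but a quick way to track such entries down is to report everything whose full path is at or over the 260-character limit. A minimal sketch (PowerShell 7 handles long paths natively; in Windows PowerShell 5.1 the worst offenders may only show up as errors):
# Sketch: list items under the mapped drive whose full path exceeds the legacy limit
Get-ChildItem -Path Y:\ -Recurse -ErrorAction SilentlyContinue |
    Select-Object FullName, @{ Name = 'PathLength'; Expression = { $_.FullName.Length } } |
    Where-Object { $_.PathLength -ge 260 } |
    Sort-Object PathLength -Descending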

Copy-PnPFile returns File Not Found

I am writing a PowerShell script that copies all files from one document library to a different one in SharePoint Online.
I can't get my head around the behavior.
When I copy the whole folder, then all the files get copied and it works fine:
Copy-PnPFile -SourceUrl "Old Library/" -TargetUrl "NewLibrary/"
This does not help me though, because I need to log each file and I can't do it this way. So I need to copy each file individually.
When I copy the files individually, I get an error message "File Not Found":
Copy-PnPFile -SourceUrl "Old Library/item.docx" -TargetUrl "NewLibrary/item.docx"
I have already tried using different paths:
relative to site
relative to root
full domain
Does anybody have an idea what the problem might be?
How can I copy the files and log individual files (name, path, success)?
The Copy-PnPFile cmdlet isn’t working correctly and an issue has been raised in the PnP PowerShell GitHub repo.
This has been ongoing for about a month now, but a fix should be introduced shortly.
https://github.com/SharePoint/PnP-PowerShell/issues/2103
I've faced the same issue and found only one working way to copy files via Copy-PnPFile across site collections within the same tenant: copy the whole root folder:
Copy-PnPFile -SourceUrl Documents -TargetUrl /sites/otherproject/Documents -SkipSourceFolderName
However, it will also try to copy allitems.aspx, so the full solution looks like this:
# use a dedicated variable; $error is an automatic variable in PowerShell
$copyError = $null
Copy-PnPFile -SourceUrl Documents -TargetUrl /sites/otherproject/Documents -SkipSourceFolderName -ErrorAction SilentlyContinue -ErrorVariable copyError
if ($copyError -and !$copyError[0].Exception.Message.ToLower().Contains("allitems.aspx")) {
    throw $copyError
}
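For the logging requirement in the question, here is a rough sketch of a per-file copy-and-log loop (assuming the Copy-PnPFile issue above is fixed in your module version); the library names, target path rewrite, and log location are placeholders, not from the original posts:
# Sketch: copy each file individually and record name, path and success in a CSV
$log = foreach ($item in (Get-PnPListItem -List "Old Library" -PageSize 500)) {
    if ($item.FieldValues["FSObjType"] -eq 1) { continue }    # 1 = folder, 0 = file
    $sourceUrl = $item.FieldValues["FileRef"]                 # server-relative URL of the file
    $targetUrl = $sourceUrl -replace "Old Library", "NewLibrary"
    $copied = $true
    try {
        Copy-PnPFile -SourceUrl $sourceUrl -TargetUrl $targetUrl -Force -ErrorAction Stop
    }
    catch {
        $copied = $false
    }
    [PSCustomObject]@{ Name = $item.FieldValues["FileLeafRef"]; Path = $sourceUrl; Success = $copied }
}
$log | Export-Csv "C:\temp\copy-log.csv" -NoTypeInformation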

Checking if a UNC Path/Server Folder Exists

I am working on a project that utilizes a PowerShell script that creates a new login on a remote SQL Server (SSMS) and then checks to see if a particular folder exists on the server. If the folder does not already exist, the script will create that folder.
The issue I am having is that I cannot verify whether or not the folder exists, since the path I am testing is a UNC path of the form "\\server\Files\Log". I have tried many different solutions found through a couple of hours of searching online, and all of them return FALSE even though I am testing a server and folder I know already exist.
I am using PowerGUI to write my script and my system is using PowerShell v5. What I have tried so far:
Test-Path $path (where $path has been set to \\server)
Test-Path "filesystem::\\Srv"
[System.IO.Directory]::Exists($path)
I even tried [System.IO.Directory]::Exists('G:\') using all of the letters I have network servers mapped to, to see if I needed to map the drives to make it work (all returned FALSE).
What am I missing here? Any thoughts on this topic would be greatly appreciated as I have been grinding on this for a while with no progress being made.
EDIT: For anyone who might stumble upon this later, please read the comments, which I found to be super helpful. My main issue was that I was running PowerShell as an administrator, which does not have the same permissions as my normal user account. Also note that Test-Path \\server alone does not work; a folder must also be referenced.
You already have the correct answer:
Test-Path $path
or
Test-Path \\server.domain.tld\ShareName
If Test-Path is returning false, I can think of three things that could be wrong:
The share does not exist on that server, or at least with the name you expect
Your user does not have permission to read that share
You are specifying the short name of the server, and you need the FQDN to resolve it. This is common in multidomain environments.
After re-reading your question, it looks like you might be running Test-Path \\server. You cannot check for the existence of a server this way, you have to specify both the server and the share name at a minimum. If you want to know that a server exists and is online, use Test-Connection (assuming you are able to ping this server in the first place). Here is an example of using Test-Connection:
$serverName = 'server.domain.tld'
$sharePath  = 'ShareName' # you can append more paths here
if ( Test-Connection $serverName 2> $null ) {
    Test-Path "\\${serverName}\${sharePath}"
}
I used to have an issue where the file existed, but Test-Path was returning false. I put Test-Path in a loop that checks a maximum of 10 times, with a one-second pause in between. The script works fine now. If it does not find the file on the first attempt, it does on the second or third. Not sure why it returns false on the first attempt.
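For reference, the retry approach described above looks roughly like this sketch ($path is whatever UNC path you are testing):
# Sketch: re-test the path up to 10 times with a one-second pause between attempts
$found = $false
for ($i = 0; $i -lt 10 -and -not $found; $i++) {
    $found = Test-Path $path
    if (-not $found) { Start-Sleep -Seconds 1 }
}
$found   # $true once the path is seen, $false if all 10 attempts fail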

Error in PowerShell when copying the content of an S3 bucket

I am copying the content of an S3 bucket to a local directory; however, I get an error output from PowerShell:
Copy-S3Object : The requested range is not satisfiable
It is pointing to this command:
Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -Region $region
Why do I get this error? Note that the files that need to be copied do indeed get copied locally.
I can't say why you are getting that error returned from S3, but I can tell you that if you are copying multiple objects you probably want to use the -LocalFolder parameter, not -LocalFile. -LocalFolder will preserve the prefixes as subpaths.
When downloading one or more objects from S3, the Read-S3Object cmdlet works the same as Copy-S3Object, but uses -KeyPrefix to specify the common prefix the objects share, and -Folder to indicate the folder they should be downloaded to.
This also reminds me I need to check why we used -LocalFolder on Copy-, and -Folder on Read- although I suspect aliases may also be available to make them consistent.
HTH
(Edit): I spent some time this morning reviewing the cmdlet code and it doesn't appear to me the cmdlet would work as-is on a multi-object download, even though it has a -LocalFolder parameter. If you have a single object to download, then using -Key/-LocalFile is the correct parameter combination. If -LocalFolder is passed, the cmdlet sets up internally to do a single file download instead of treating -Key as a common key prefix to a set of objects. So, I think we have a bug here that I'm looking into.
In the meantime, I would use Read-S3Object to do your downloads. It supports both single (-Key) or multi-object download (-KeyPrefix) modes. https://docs.aws.amazon.com/powershell/latest/reference/index.html?page=Read-S3Object.html&tocid=Read-S3Object
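For example, a multi-object download with Read-S3Object looks roughly like this sketch; the bucket name, key prefix, local folder, and region are placeholders:
# Sketch: download every object under the given key prefix into a local folder
Read-S3Object -BucketName "my-bucket" -KeyPrefix "backups/2020/" -Folder "C:\temp\backups" -Region "eu-west-1"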
This seems to occur with folders that do not contain files, since the copy wants to copy files.
I accepted this error and trapped it:
catch [Amazon.S3.AmazonS3Exception]
{
    # get error record
    [Management.Automation.ErrorRecord]$e = $_
    # retrieve information about runtime error
    $info = [PSCustomObject]@{
        Exception = $e.Exception.Message
        Reason    = $e.CategoryInfo.Reason
        Target    = $e.CategoryInfo.TargetName
        Script    = $e.InvocationInfo.ScriptName
        Line      = $e.InvocationInfo.ScriptLineNumber
        Column    = $e.InvocationInfo.OffsetInLine
        ErrorCode = $e.Exception.ErrorCode
    }
    if ($info.ErrorCode -eq "InvalidRange") {
        # do nothing - expected for the empty folders described above
    }
    else {
        # output information. Post-process collected info, and log info (optional)
        Write-Host $info -ForegroundColor Red
    }
}
This happened to me when I tried to download a file which had more than one dot in its name. Simplifying the file name fixed the error.
File name that gave me the error: myfile-18.10.exe
File name that worked: myfile-1810.exe

PowerShell to find the machine that created a file

I have a script that monitors the filesystem using a System.IO FileSystemWatcher in PowerShell.
Currently it finds the user that made the file with:
$owner = (Get-Acl $path).Owner
And it finds the computer that the file was made on with:
$Computer = get-content env:computername
But I'd also like to obtain what machine the file was created from. For instance, if a user is logged into a terminal server, I can see the file is made on the terminal server. But I want to know the host name of the local machine that made the file on the terminal server.
Is this possible? I've been searching the msdn PSEventArgs Class page without much success.
That information is not going to be stored in the file or its metadata, so no, there's no straightforward way to get at it.
By the way, you can just use $env:computername directly as a variable; there's no need to use Get-Content.
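For reference, the monitoring setup described in the question boils down to something like this sketch; the watched path and source identifier are placeholders, not from the original posts:
# Sketch: watch a folder for new files and record the owner plus the server the watcher runs on
$watcher = New-Object System.IO.FileSystemWatcher 'D:\Shares\Drop'
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent -InputObject $watcher -EventName Created -SourceIdentifier FileCreated -Action {
    $path     = $Event.SourceEventArgs.FullPath
    $owner    = (Get-Acl $path).Owner     # user that created the file
    $computer = $env:COMPUTERNAME         # server the watcher (and file) is on, not the client machine
    Write-Host "$path created by $owner on $computer"
}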