There is a back-end SQL DB that contains "managed folders" in the form of UNC paths. Using SQL queries in PowerShell, I have a loop that works its way through these folders and runs a GCI operation against each one to work out how much disk space it is using.
$managedFolder = "\\server\share\folder\subfolder"
For the sake of the question, $managedFolder is declared as above. The failing command is below:
$diskTrendsInitialUsage = "{0:N2}" -f ((Get-ChildItem $managedFolder -Recurse -Force | Measure-Object -Property Length -Sum).Sum / 1GB)
Now, if I run this command manually in a PS console it's fine; it pulls back data. But as soon as it's packaged in a script, it fails with the error below. The folder is accessible from the server, as it works fine from a local PS console session.
ERROR: Get-ChildItem : Invalid Path: '\\server\share\folder\subfolder'.
AddManagedFolder.psf (17): ERROR: At Line: 17 char: 42
ERROR: + $diskTrendsInitialUsage = "{0:N2}" -f ((Get-ChildItem $managedFolder -Recurse ...
ERROR: + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
ERROR: + CategoryInfo : NotSpecified: (:) [Get-ChildItem], ArgumentException
ERROR: + FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.GetChildItemCommand
ERROR:
I'm stumped.
The problem with your path is that it does not have any indication of which provider to use, so PowerShell just uses the current one. If the current provider is not a file system provider, it will fail. So you need to specify the provider in the path, to allow PowerShell to choose the right one regardless of the current provider:
$managedFolder = "filesystem::\\server\share\folder\subfolder"
My guess is you are using the SQL PS cmdlets prior to running GCI; this changes your provider path to SQL:, which is what is causing GCI to be unhappy.
Prior to running GCI, do cd c:\ to change the path back to the file system, and GCI will work.
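For example, a minimal sketch of both approaches inside the script, using the same $managedFolder and calculation as above:
# Option 1: qualify the path with the FileSystem provider so GCI resolves it regardless of the current provider
$diskTrendsInitialUsage = "{0:N2}" -f ((Get-ChildItem "filesystem::$managedFolder" -Recurse -Force | Measure-Object -Property Length -Sum).Sum / 1GB)
# Option 2: switch to a file system drive before calling GCI, then return to the previous (SQL) location
Push-Location C:\
$diskTrendsInitialUsage = "{0:N2}" -f ((Get-ChildItem $managedFolder -Recurse -Force | Measure-Object -Property Length -Sum).Sum / 1GB)
Pop-Location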
I employ a user-defined function, called searchfor, from a PowerShell console prompt (Run as Administrator) to find files containing strings:
PS repo> gc Function:\searchfor
param([string]$root, [string[]]$includeexpression, [string]$regexp)
$fullpath = convert-path $root
Get-ChildItem -force -recurse $fullpath -include $includeexpression | Select-String $regexp
This has recently started to fail on some files with the following "cannot be read/access denied" error message:
Select-String : The file C:\builds\repo\LightweightSerialization\LightweightSerializationWriter.cs cannot be read: Access to the path 'C:\builds\repo\LightweightSerialization\LightweightSerializationWriter.cs' is denied.
At C:\Users\schlagermeier\Documents\WindowsPowerShell\Microsoft.PowerShell_profile.ps1:174 char:71
+ ... recurse $fullpath -include $includeexpression | Select-String $regexp
+ ~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Select-String], ArgumentException
+ FullyQualifiedErrorId : ProcessingFile,Microsoft.PowerShell.Commands.SelectStringCommand
An attempt to read the file interactively with Get-Content fails with the same error. However, if I launch a Command Prompt (cmd.exe), also using Run as Administrator, I can read the file content with the type command.
I've looked at the file's permissions and ACL, and it doesn't appear any different from others in the same folder that can be read by PowerShell. Can anybody suggest possible causes and how they might be identified and fixed?
I'm trying to use a script I found on the internet to copy all files in a directory, but I can't get it to work. Can anybody help debug? I'm guessing the script was used to transfer Windows to Windows, but I need Windows --> Linux.
https://www.powershellmagazine.com/2013/12/17/pstip-copying-folders-using-copy-vmfile-cmdlet-in-windows-server-2012-r2-hyper-v/
Get-ChildItem C:\tmp -Recurse -File | % { Copy-VMFile -Name "OpenProject8.3" -SourcePath $_.FullName -DestinationPath "/tmp/" -FileSource Host }
The issue seems to be related to the source path, but I'm not 100% sure.
Copy-VMFile : Failed to initiate copying files to the guest.
Failed to copy the source file 'C:\tmp\svn-repositories-20200212010002.tar.gz' to the destination '/tmp/' in the guest.
At line:1 char:43
+ ... -File | % { Copy-VMFile -Name "OpenProject8.3" -SourcePath $_.FullNam ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-VMFile], VirtualizationException
+ FullyQualifiedErrorId : OperationFailed,Microsoft.HyperV.PowerShell.Commands.CopyVMFile
Well... I feel stupid! The issue was that during earlier testing the transfer had succeeded, and when the file already exists, Copy-VMFile won't overwrite it (even though it uses the root account...) and gives a non-descriptive error! The code above works fine as is.
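For anyone else who hits this, a rough sketch of the same loop that reports which file failed instead of stopping on the vague error (VM name and paths as in the command above; the "file already exists in the guest" explanation is just what I observed in my testing):
Get-ChildItem C:\tmp -Recurse -File | ForEach-Object {
    $file = $_
    try {
        # -ErrorAction Stop turns the copy failure into a terminating, catchable error
        Copy-VMFile -Name "OpenProject8.3" -SourcePath $file.FullName -DestinationPath "/tmp/" -FileSource Host -ErrorAction Stop
    }
    catch {
        # In my case this meant the file already existed in the guest; delete or rename it there first
        Write-Warning "Could not copy $($file.Name): $($_.Exception.Message)"
    }
}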
Is there a limit to the size of the path that Get-ChildItem and Select-String can handle? If yes, what is the alternative?
When I run the following command on the path:
PS E:\KINGSTON backup5\03 Learning\Softwares\Mathematica\Mathematica 12\Mathematica Directories Backup2\C,Users,atfai,AppData,Roaming,Mathematica\Paclets\Repository\SystemDocsUpdate1-12.0.0\Documentation\English\Workflows> get-childitem -recurse -filter "*.nb" -file | select-string -pattern ".*ProcessObject.*" -casesensitive
I get the following error
select-string : The file E:\KINGSTON backup5\03
Learning\Softwares\Mathematica\Mathematica 12\Mathematica Directories
Backup2\C,Users,atfai,AppData,Roaming,Mathematica\Paclets\Repository\SystemDocsUpdate1-12.0.0\Documentation\English\Workflows\ChangeTheStyleOfPointsInA2DScatterPlot.nb
cannot be read: Could not find a part of the path 'E:\KINGSTON
backup5\03 Learning\Softwares\Mathematica\Mathematica 12\Mathematica
Directories
Backup2\C,Users,atfai,AppData,Roaming,Mathematica\Paclets\Repository\SystemDocsUpdate1-12.0.0\Documentation\English\Workflows\ChangeTheStyleOfPointsInA2DScatterPlot.nb'.
At line:1 char:47
+ ... nb" -file | select-string -pattern ".*ProcessObject.*" -casesensitive ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [Select-String], ArgumentException
+ FullyQualifiedErrorId : ProcessingFile,Microsoft.PowerShell.Commands.SelectStringCommand
Moreover if I run the same command on the following path:
PS E:\Computer Backup\Downloads - Current\Windows 10 Optimization\SoftwareDistribution.old3\Download\736aed4d238d4999f5ea5b04589077ed\Package_for_RollupFix~~amd64~~17134.677.1.6\x86_wcf-system.servicemodel_b03f5f7f11d50a3a_10.0.17134.254_none_d5ff175e12d127c0> get-childitem -recurse -filter "*.nb" -file | select-string -pattern ".*ProcessObject.*" -casesensitive
This time I get the error from Get-ChildItem:
get-childitem : Could not find a part of the path 'E:\Computer
Backup\Downloads - Current\Windows 10
Optimization\SoftwareDistribution.old3\Download\736aed4d238d4999f5ea5b
04589077ed\Package_for_RollupFix~~amd64~~17134.677.1.6\x86_wcf-system.servicemodel_b03f5f7f11d50a3a_10.0.17134.254_none_d5ff175e12d127c0'.
At line:1 char:1
+ get-childitem -recurse -filter "*.nb" -file | select-string -pattern ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ReadError: (E:\Computer Bac...5ff175e12d127c0:String) [Get-ChildItem],
DirectoryNotFoundException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
What does "Could not find a part of the path" mean? Drive E has an NTFS file system, which is supported by Windows, so PowerShell commands should be able to handle it, shouldn't they? What is going on here?
BTW, I can access both paths from Windows Explorer and open the files in Notepad, so the paths exist and the files are clearly not corrupt or inaccessible.
The problem is that long paths aren't enabled on your OS, so there is a limit of 260 characters.
Depending on the version of Windows you are running, this can be fixed by enabling the group policy Local Computer Policy > Computer Configuration > Administrative Templates > System > Filesystem > NTFS > Enable NTFS long paths.
If you don't have that option, changing the registry value LongPathsEnabled under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem from 0 to 1 does the job as well.
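If you'd rather flip that registry value from PowerShell instead of regedit, a one-liner along these lines should work (run from an elevated session; a new session or a reboot may be needed before it takes effect):
Set-ItemProperty -Path 'HKLM:\SYSTEM\CurrentControlSet\Control\FileSystem' -Name 'LongPathsEnabled' -Value 1  # same value the group policy toggles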
If you can't use the registry or group policy fix from the other answer, you may be able to work around this by using the \\?\ prefix, e.g. \\?\E:\folder1\...
To specify an extended-length path, use the "\\?\" prefix. For example, "\\?\D:\very long path".
[...]
The "\\?\" prefix can also be used with paths constructed according to the universal naming convention (UNC). To specify such a path using UNC, use the "\\?\UNC\" prefix. For example, "\\?\UNC\server\share", where "server" is the name of the computer and "share" is the name of the shared folder.
Ref: https://learn.microsoft.com/en-ca/windows/win32/fileio/maximum-file-path-limitation?tabs=cmd
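As a rough sketch against the first path in the question (treat this as something to try; whether the file system provider accepts the prefix can depend on your PowerShell version):
# Prefix the drive path with \\?\ and use -LiteralPath so nothing gets re-interpreted
Get-ChildItem -LiteralPath '\\?\E:\KINGSTON backup5\03 Learning\Softwares\Mathematica\Mathematica 12' -Recurse -Filter '*.nb' -File | Select-String -Pattern '.*ProcessObject.*' -CaseSensitive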
Update
After a bit more testing, it seems that the file name is not the problem, as I can copy a new 0 KB file with the same name without an error. However, the file I am trying to copy is around 8 GB in size.
I am getting an annoying error when trying to copy a load of files from one drive to another. The Copy-Item command looks like this:
Copy-Item $oldLocation $newLocation -Recurse -Force
Where the parameters are:
$oldLocation = 'E:\Documents\Outlook Files\name#domain.co.za.pst'
$newLocation = 'F:\PST Files\EZ-SWAP EX\Documents\Outlook Files\name#domain.co.za.pst'
I have also tried this on its own, in a separate PowerShell window, and without the Recurse and Force switches, with the same result. I also tried the command without putting the paths in parameters, just specifying the strings.
Note that I am copying from one external hard drive to another external hard drive.
They all seem to work except for one file, which throws the following error:
Copy-Item : The parameter is incorrect.
At line:4 char:1
+ Copy-Item $old $new -Force -Recurse
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-Item], IOException
+ FullyQualifiedErrorId : System.IO.IOException,Microsoft.PowerShell.Commands.CopyItemCommand
This file is unique in that the file name looks like this:
name#domain.co.za.pst
whereas all the other files are just called
filename.pst
I'm not sure if the included domain is causing this, but could that be the issue?
If not, what could possibly be going wrong here? The error message is not very helpful at all.
My $PSVersionTable.PSVersion outputs
Major Minor Build Revision
----- ----- ----- --------
5 1 14393 693
If you are running on any version of Windows 7 or earlier, or if the destination file system is of type FAT32 regardless of Windows version, you are limited to a maximum file size of 4GB. Since you indicate that the problem file is 8GB, and you've also indicated that a zero-byte file of the same name presents no problem, this is the most likely cause of your issue.
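To confirm this, you can check how the destination drive is formatted before copying; a quick sketch (F: taken from your $newLocation, and fsutil may need an elevated prompt):
Get-Volume -DriveLetter F | Select-Object DriveLetter, FileSystem
fsutil fsinfo volumeInfo F:   # alternative check that also works on older systems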
Try using double quotation marks around the paths, like this:
Copy-Item "C:\PTS\1\Copy-Item\Old\name#domain.co.za.pst" -Destination "C:\PTS\1\Copy-Item\New\" -Recurse
I'm working with PowerShell, running a script (from my console) that includes this line:
$inpath = "C:\users\xxxxx\path\foo\bar"
and I keep getting this error:
Get-Content : Access to the path 'C:\users\xxxxx\path\foo\bar' is denied.
At C:\users\xxxxx\path\foo\testscript.ps1:53 char:12
+ Get-Content <<<< $txtfile | Get-WordCount -Exclude (Get-Content c:\temp\exclude.txt) | select -First 15
+ CategoryInfo : PermissionDenied: (C:\users\xxxxx\path\foo\bar:String) [Get-Content], UnauthorizedAccessException
+ FullyQualifiedErrorId : GetContentReaderUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetContentCommand
The scripts and target files are all located on my local drive. I can access the files in Explorer, view/edit/save them using Notepad, and do not have any permission restrictions set. When I'm on the command line, I can run the Get-Content cmdlet successfully on files in my path. I can change directories (PS C:\> cd C:\users\xxxxx\path\foo\bar) and successfully list what's there. Even more interesting, I can duplicate the line that's erroring in the script and NOT receive an error on the command line.
PS C:\users\xxxxx\path\foo> $inpath = "C:\users\xxxxx\path\foo\bar"
PS C:\users\xxxxx\path\foo>
This makes me suspect that the 'Permission Denied' error is actually something else, or something vague enough that I've got no clue how to proceed with troubleshooting. Is it possible for PS to have different permissions than the user under which it's running? Has anyone seen this behavior before, and how did you solve the problem? I'm sure there's a simple solution that I don't know.
Get-Content : Access to the path 'C:\users\xxxxx\path\foo\bar' is denied.
At C:\users\xxxxx\path\foo\testscript.ps1:53 char:12
That path doesn't look like a file; it looks like a folder.
Are you sure you are appending the file name to the folder path and passing that to Get-Content?
Windows gives Access Denied when you try to open a directory as if it were a file without passing extra flags, and .NET does not pass those flags (there are a few specific circumstances for opening a folder, but they do not apply here).
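A quick way to see which of the two you are actually handing to Get-Content, assuming $txtfile holds the offending value from your script:
Test-Path $txtfile -PathType Leaf       # True if it is a file
Test-Path $txtfile -PathType Container  # True if it is a folder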
Get-Content reads the contents of a file, not a folder. Please add *.* after your folder path, like below:
Get-Content "D:\Logs\*.*" | ?{($_|Select-String "test")}
If you also want to go through all the folders underneath it, recurse with Get-ChildItem and pipe the files into Get-Content, like below:
Get-ChildItem "D:\Logs" -Recurse | ?{ -not $_.PSIsContainer } | Get-Content | ?{($_|Select-String "test")}
Instead of this (as per your comment):
foreach ($doc in $inpath) { do-function }
try this:
foreach ($doc in (gci $inpath)) { do-function }
You are doing a foreach on a string object instead of your folder items.
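Putting it together with the question's own pipeline, a minimal sketch (Get-WordCount and the exclude list are the ones from your script; the PSIsContainer filter just skips sub-folders):
foreach ($doc in (Get-ChildItem $inpath | Where-Object { -not $_.PSIsContainer })) {
    # $doc is now a FileInfo object, so pass its full path to Get-Content
    Get-Content $doc.FullName | Get-WordCount -Exclude (Get-Content c:\temp\exclude.txt) | Select-Object -First 15
}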