Copy-Item fails on large file - powershell

Update
After a bit more testing, it seems that the file name is not the problem, as I can copy a new 0 KB file with the same name without an error. However, the file I am trying to copy is around 8 GB in size.
I am getting an annoying error when trying to copy a load of files from one drive to another. The Copy-Item command looks like this:
Copy-Item $oldLocation $newLocation -Recurse -Force
Where the parameters are:
$oldLocation = 'E:\Documents\Outlook Files\name#domain.co.za.pst'
$newLocation = 'F:\PST Files\EZ-SWAP EX\Documents\Outlook Files\name#domain.co.za.pst'
I have also tried the command on its own, in a separate PowerShell window, and without the -Recurse and -Force switches, with the same result. I also tried the command without putting the paths in variables, just specifying the strings directly.
Note that I am copying from one external hard drive to another external hard drive.
They all seem to work except for one file, which throws the following error:
Copy-Item : The parameter is incorrect.
At line:4 char:1
+ Copy-Item $old $new -Force -Recurse
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Copy-Item], IOException
+ FullyQualifiedErrorId : System.IO.IOException,Microsoft.PowerShell.Commands.CopyItemCommand
This file is unique in that the file name looks like this:
name#domain.co.za.pst
Where all the other files are just called
filename.pst
I'm not sure if the included domain is causing this, but could that be the issue?
If not, what could possibly be going wrong here? The error message is not very helpful at all.
My $PSVersionTable.PSVersion outputs
Major Minor Build Revision
----- ----- ----- --------
5 1 14393 693

If the destination file system is FAT32, you are limited to a maximum file size of 4 GB, regardless of Windows version. Since you indicate that the problem file is 8 GB, and you've also indicated that a zero-byte file of the same name presents no problem, this is the most likely cause of your issue.
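A quick way to confirm (a sketch, not from the original answer; it assumes the source path from the question, F: as the destination drive, and Get-Volume, which ships with Windows 8 / Server 2012 and later):
$file = Get-Item 'E:\Documents\Outlook Files\name#domain.co.za.pst'
$dest = Get-Volume -DriveLetter F
'{0:N2} GB file -> {1} volume' -f ($file.Length / 1GB), $dest.FileSystem
if ($dest.FileSystem -like 'FAT*' -and $file.Length -ge 4GB) {
    Write-Warning 'File exceeds the 4 GB FAT32 limit; use an NTFS or exFAT destination, or split the file.'
}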

Try putting double quotes around the paths and then try again:
Copy-Item "C:\PTS\1\Copy-Item\Old\name#domain.co.za.pst" -Destination "C:\PTS\1\Copy-Item\New\" -Recurse

Related

Powershell cmdlet: move file to Team Drive

I'm having a hard time moving a file from a local directory and into Team Drive.
I have a feeling I may be forced to step away from PS and find another route, which I really don't want to do, but here goes.
This command does not work:
Move-Item -Path 'C:\Program Files (x86)\FieldSmart View\Logs\BgSync.log' -Destination 'G:\Team Drives\LGE Prints\Logs\$env:computername.txt'
This command does work:
Move-Item -Path 'C:\Program Files (x86)\FieldSmart View\Logs\BgSync.log' -Destination C:\Users\ITAdmin\Desktop\Test\$env:computername.txt
The only difference is the destination.
When trying to move a file into Team Drive this is the error that is returned:
Move-Item : The given path's format is not supported.
At line:1 char:1
Move-Item -Path 'C:\Program Files (x86)\FieldSmart View\Logs\BgSync.l ...
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
CategoryInfo : NotSpecified: (:) [Move-Item], NotSupportedException
FullyQualifiedErrorId : System.NotSupportedException,Microsoft.PowerShell.Commands.MoveItemCommand
What can I do?
My guess is that you are using single quotes rather than double quotes where you want variable substitution in your string. Your second example has no quotes at all, so the command line expands $env:computername before building the path.
Run this:
"single"
'G:\Team Drives\LGE Prints\Logs\$env:computername.txt'
"double"
"G:\Team Drives\LGE Prints\Logs\$env:computername.txt"
This link looks good, as would any other search on variable substitution in strings:
https://kevinmarquette.github.io/2017-01-13-powershell-variable-substitution-in-strings/
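As an alternative sketch (assuming the same source and destination paths from the question, with G: mapped on the machine), building the destination with Join-Path keeps the quoting straightforward:
# Build the destination path first, then move; $env:COMPUTERNAME expands inside double quotes
$dest = Join-Path 'G:\Team Drives\LGE Prints\Logs' "$env:COMPUTERNAME.txt"
Move-Item -Path 'C:\Program Files (x86)\FieldSmart View\Logs\BgSync.log' -Destination $dest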

Powershell Move-Item from Import-CSV: Error - Could not find part of the path

I've been working on a script in PowerShell to get paths from a CSV file and move the files at those paths to a new destination elsewhere, often with a different filename.
I am using Version 5.0
For example:
Source Destination : C:\1\2\3\File.pdf, D:\3\7\8\9\FILE1.pdf
Now I used the following script and it was initially able to move some of the files:
Import-CSV "R:\MoveFiles.csv" -Delimiter "," -ErrorAction Stop | ForEach-Object{Move-Item -path $_.Source -Destination $_.Destination}
Although around half way through executing it started to return this error:
Move-Item : Could not find a part of the path. At line:1 char:238
+ ... Each-Object{Move-Item -Literalpath $_.Source -Destination $_.Destina ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : WriteError: (Q:\RECORDS\PRIV...-4-20_N1969.pdf:FileInfo) [Move-Item],
DirectoryNotFoundException
+ FullyQualifiedErrorId : MoveFileInfoItemIOError,Microsoft.PowerShell.Commands.MoveItemCommand
As far as I can tell there are no special characters that would prevent the path being found. If I replace Move-Item with Copy-Item, it returns the same error. I have also checked whether the paths actually exist.
I am at my wits end with this. Not sure what else to try. I am after all a complete novice.
Thank you
NB: I worked out a solution to this issue. It would appear that the Move-Item cmdlet will not create missing destination directories for you.
Instead I made the directories first with New-Item -Type Directory, getting the paths from a text document where every line is a destination path (no headers).
After creating the empty directories first, the original script worked as intended.
For anyone interested here is the directories script:
# Create directories from a text file (one destination path per line, no headers)
$paths = Get-Content ".\Create_New_Directories\Move_Directories_Test.txt"
$paths | ForEach-Object {
    # -Force lets New-Item succeed even if part of the path already exists
    New-Item -Force -Verbose -Path $_ -Type Directory
} | Out-File ".\Create_New_Directories\Newoutput.txt"
Thank you everyone for your help.
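For anyone who'd rather do it in a single pass, a rough sketch along these lines should also work (using the same MoveFiles.csv from above), creating each missing destination folder just before the move:
Import-Csv 'R:\MoveFiles.csv' | ForEach-Object {
    # Create the destination's parent folder if it doesn't exist yet
    $destDir = Split-Path $_.Destination -Parent
    if (-not (Test-Path -LiteralPath $destDir)) {
        New-Item -Path $destDir -ItemType Directory -Force | Out-Null
    }
    Move-Item -LiteralPath $_.Source -Destination $_.Destination
}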
To debug such cases, consider Move-Item's -WhatIf parameter. Like so,
... | ForEach-Object{Move-Item -whatif -path $_.Source -Destination $_.Destination}
This will print the intended operation, so you can double-check the paths for any shenanigans.
What if: Performing the operation "Move File" on target "Item:
C:\Temp\SomeFile.xml Destination: C:\Temp\Somewhere\SomeFile.xml".
Not sure, but your error message indicates it's a write error of type DirectoryNotFound.
So perhaps you should make sure you have the permissions on the target side and are not exceeding the maximum path length.
Some other things to consider/try:
Your CSV file should be in the format (the first line must be the headers):
Source,Destination
C:\1\2\3\SomeFile.pdf,D:\1\2\3\SomeFile.pdf
C:\1\2\3\SomeFile2.pdf,D:\1\2\3\SomeFile2.pdf
Also, you are not sanitizing your input, so if you made the CSV file in Excel you might have leading or trailing spaces. In that case either clean the file by editing it in Notepad, or try $_.Source.trim() and $_.Destination.trim().
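For example, a minimal trimming sketch (same CSV as above):
Import-Csv 'R:\MoveFiles.csv' | ForEach-Object {
    # Strip any stray leading/trailing spaces around the paths before moving
    Move-Item -Path $_.Source.Trim() -Destination $_.Destination.Trim()
}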
And as the other answer said, the -WhatIf switch is useful, and so is -Verbose.
You might also try Move-Item -Force and/or opening PowerShell as an Administrator.
Good Luck! ;-)

Misbehaving Get-ChildItem Operation in PowerShell

There is a back-end SQL DB that contains "managed folders" in the form of UNC paths. Using SQL queries in PowerShell, I have a loop that works its way through these folders and runs a GCI (Get-ChildItem) operation against them to work out how much disk space they are using.
$managedFolder = "\\server\share\folder\subfolder"
For the sake of the question, $managedFolder is declared as above. The failing command below:
$diskTrendsInitialUsage = "{0:N2}" -f ((Get-ChildItem $managedFolder -Recurse -Force | Measure-Object -Property Length -Sum).Sum / 1GB)
Now if I run this command manually in a PS console, it's fine; it pulls back data. But as soon as it's packaged in a script, it fails with the error below. The folder is accessible from the server, as it works fine from a local PS console session.
ERROR: Get-ChildItem : Invalid Path: '\\server\share\folder\subfolder'.
AddManagedFolder.psf (17): ERROR: At Line: 17 char: 42
ERROR: + $diskTrendsInitialUsage = "{0:N2}" -f ((Get-ChildItem $managedFolder -Recurse ...
ERROR: + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
ERROR: + CategoryInfo : NotSpecified: (:) [Get-ChildItem], ArgumentException
ERROR: + FullyQualifiedErrorId : System.ArgumentException,Microsoft.PowerShell.Commands.GetChildItemCommand
ERROR:
I'm stumped.
The problem with your path is that it does not indicate which provider to use, so PowerShell just uses the current one. If the current provider is not the file system provider, the call fails. Specify the provider in the path so PowerShell can choose the right one regardless of the current provider:
$managedFolder = "filesystem::\\server\share\folder\subfolder"
My guess is you are using the SQL PS cmdlets prior to running GCI; this changes your current provider path to SQL:, which is what is making GCI unhappy.
Prior to running GCI, do cd c:\ to change the location back to the file system and GCI will work.
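Putting both suggestions together, a rough sketch (using the example UNC path and the sizing command from the question):
# Either qualify the path with the FileSystem provider...
$managedFolder = "filesystem::\\server\share\folder\subfolder"
# ...or hop back to a file-system drive before calling Get-ChildItem
Set-Location C:\
$diskTrendsInitialUsage = "{0:N2}" -f ((Get-ChildItem $managedFolder -Recurse -Force | Measure-Object -Property Length -Sum).Sum / 1GB)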

I need help understanding PowerShell security and file access issues

I'm working with PowerShell, running a script (from my console) that includes this line:
$inpath = "C:\users\xxxxx\path\foo\bar"
and I keep getting this error:
Get-Content : Access to the path 'C:\users\xxxxx\path\foo\bar' is denied.
At C:\users\xxxxx\path\foo\testscript.ps1:53 char:12
+ Get-Content <<<< $txtfile | Get-WordCount -Exclude (Get-Content c:\temp\exclude.txt) | select -First 15
+ CategoryInfo : PermissionDenied: (C:\users\xxxxx\path\foo\bar:String) [Get-Content], UnauthorizedAcc
essException
+ FullyQualifiedErrorId : GetContentReaderUnauthorizedAccessError,Microsoft.PowerShell.Commands.GetContentCommand
The scripts and target files are all located on my local drive. I can access the files in Explorer, view/edit/save them using Notepad, and do not have any permission restrictions set. On the command line, I can run the Get-Content cmdlet successfully on files in my path. I can change directories (PS C:\> cd C:\users\xxxxx\path\foo\bar) and successfully list what's there. Even more interesting, I can duplicate the line that errors in the script and NOT receive an error on the command line.
PS C:\users\xxxxx\path\foo> $inpath = "C:\users\xxxxx\path\foo\bar"
PS C:\users\xxxxx\path\foo>
This makes me suspect that the 'Permission Denied' error is actually something else, or something vague enough that I've got no clue how to proceed with troubleshooting. Is it possible for PS to have different permissions than the user under which it's running? Has anyone seen this behavior before, and how did you solve the problem? I'm sure there's a simple solution that I don't know.
Get-Content : Access to the path 'C:\users\xxxxx\path\foo\bar' is denied.
At C:\users\xxxxx\path\foo\testscript.ps1:53 char:12
That path doesn't look like it is a file but a folder.
Are you sure you are appending the file name to the folder and passing that to Get-Content?
Windows gives Access Denied when you try to open a directory as if it were a file without passing extra flags, and .NET does not pass those flags (there are a few specific circumstances for opening a folder, but they do not apply here).
Get-Content reads the contents of a file, not a folder. Add a file name or wildcard after your folder path, like below:
Get-Content "D:\Logs\*.*" | ?{($_|Select-String "test")}
If you want to go through all the folders underneath it as well, note that Get-Content has no -Recurse switch; recurse with Get-ChildItem and pipe the files into Get-Content instead:
Get-ChildItem "D:\Logs" -Recurse | ?{ -not $_.PSIsContainer } | Get-Content | ?{($_|Select-String "test")}
Instead of this: (as per your comment)
foreach ($doc in $inpath) { do-function }
try this:
foreach ($doc in (gci $inpath)) { do-function }
You are doing a foreach on a string object instead of your folder items.
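Expanded into a small sketch (Get-WordCount is the asker's own function, so a simple Select-Object stands in for the rest of the pipeline here):
$inpath = "C:\users\xxxxx\path\foo\bar"
# Enumerate the files in the folder and read each one,
# rather than handing the folder path itself to Get-Content
foreach ($doc in (Get-ChildItem $inpath | Where-Object { -not $_.PSIsContainer })) {
    Get-Content -LiteralPath $doc.FullName | Select-Object -First 15
}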

Trying to copy a group of files contained in a text file

I'm trying to copy a list of files from a txt file and, as a newbie, I'm having a hard time.
Here is a bit of the text file. The real file has no extra blank lines; I had to add them here to get it to display:
"D:\Shared\Customer Care\Customer Care Common\Customers Contracted\Customers Contracted\Fred 44705"
"D:\Shared\Customer Care\Customer Care Common\Customers Contracted\Customers Contracted\Johnson 47227"
"D:\Shared\Customer Care\Customer Care Common\Customers Contracted\Customers Contracted\Daniel 35434"
"D:\Shared\Customer Care\Customer Care Common\Customers Contracted\Customers Contracted\Frank, John 48273"
I've tried enclosing the filename string in double-quotes as well.
Here's the simple script I'm trying to use:
Get-Content c:\users\scripts\files-to-fix.txt | Foreach-Object {copy-item $_ d:\junk}
The error I'm getting is:
Copy-Item : Cannot find drive. A drive with the name '"D' does not exist.
At C:\users\mhyman\scripts\copyfiles.ps1:2 char:81
+ Get-Content c:\users\mhyman\scripts\files-to-fix.txt | Foreach-Object {copy-item <<<< $_ d:\junk}
+ CategoryInfo : ObjectNotFound: ("D:String) [Copy-Item], DriveNotFoundException
+ FullyQualifiedErrorId : DriveNotFound,Microsoft.PowerShell.Commands.CopyItemCommand
I know this is simple, but I would really appreciate some help.
I think it is the surrounding quotes that are causing the problem (as indicated by the error saying that a drive named "D cannot be found). Try this:
get-content c:\users\scripts\files-to-fix.txt | %{ copy-item $_.trim('"') d:\junk}
Of course, if you can control the txt file, enter the list without the quotes.
Judging by your tags, drive letters, and backslashes, it is clearly a Windows environment you're working in. Although I'm not a PowerShell scripter, I'm a better-than-most batch scripter and would use a for /F statement, since it is shorter and you feed it your file instead of parsing the file out into redundant commands line by line. So in your example (delims= keeps paths with spaces intact):
for /F "delims=" %%t in (c:\users\scripts\files-to-fix.txt) do copy %%t d:\junk
And then you go home and never worry about it until the next morning.
Does PowerShell have a runas or native mode that can parse older, more proven and stable DOS commands?