PowerShell's Copy-Item command is not working for large files - robocopy

I am using PowerShell's Copy-Item to copy files from source to destination. Below is the command I am using:
Copy-Item -Path $fpath -Destination D:\abc\copy_location
where $fpath is set by $fpath = $Event.SourceEventArgs.FullPath.
While testing I found that it copies small files fine, but it fails on large files (~300-400 MB). The largest files I have to copy from source to destination are around 400 MB. I saw a Stack Overflow post, "Copy-Item fails on large file", which suggested using double quotes around the path; I tried that as well but with no success.
Please advise what to do. The other option, as I understand it, is to use the robocopy command:
robocopy source destination file_to_copy
Here I am facing one issue: my source is $fpath, but that gives me the full path down to the file, and I only want the path up to the containing folder.
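I assume I could split $fpath into its folder and file name parts before calling robocopy; something like this untested sketch using Split-Path (the destination folder here is just an example):
# $fpath holds the full path to the file reported by the event
$sourceDir = Split-Path -Path $fpath -Parent   # the containing folder
$fileName  = Split-Path -Path $fpath -Leaf     # just the file name
# robocopy takes: source folder, destination folder, file name(s)
robocopy $sourceDir 'D:\abc\copy_location' $fileName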

Still hit the same problem a few years later, when copying large files from Linux to Windows using PowerShell. Smaller files from the same folder were copied successfully, but the larger ones failed. Found a solution at the end of this issue: https://github.com/powershell/powershell/issues/4952
Basically, you need to set MaxEnvelopeSizekb to a bigger value; 1000 worked for me for files over 6 MB.
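For reference, the setting lives under the WSMan: drive on the Windows side, so raising it might look like this (a sketch; it needs an elevated session and a running WinRM service):
# Check the current maximum envelope size (in KB)
Get-Item WSMan:\localhost\MaxEnvelopeSizekb
# Raise it to 1000 KB, the value that worked for the files mentioned above
Set-Item WSMan:\localhost\MaxEnvelopeSizekb -Value 1000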

Related

PowerShell Move-Item does not remove some 3D PNG files from source

I have a PowerShell script issuing the Move-Item command to move some 3D CAD directories and subdirectories from one network share to another, about 300 parent folders a day. I am having the strangest problem: all parent folders copy to the destination just fine, but about 10 of the 300 objects are not removed from the source. It seems that any subdirectory containing a certain type of PNG scan file will not be removed from the source; it copies fine, it just doesn't get removed. No error is reported in the -Verbose log output. The file sizes are not large, < 50 MB each.
Doing some testing, I also noticed that if I run the script once the problem occurs, but if I run the script again with no changes it moves the remaining objects that it did not move on the initial run. Another observation: if I run the script with the source and destination on the same share, the problem does not occur; it only happens when source and destination are on different shares. The same problem occurs with both versions 5.1 and 4.0.
I also tried some simple PNG files I created myself just by saving a JPG screen print as a PNG, and those PNG files copy and remove fine, so something is special about these 3D CAD PNG files. I even tried copying the parent folder off to another share to make sure no one and no program was locking the original files, and the problem still happens.
I have successfully replaced the Move-Item command with a robocopy command and it works perfectly, so I could use robocopy to solve my problem, but I want to figure out why Move-Item is having this issue. My only theory is that when source and destination are on the same share, no data is actually moved; only the reference to the file (the pointer, i.e. the file-allocation-table entry) is changed. When crossing shares, the actual file data has to be moved, and something about these files prevents that from happening. But as stated above, if I run the script a second time, the remaining files copy and remove fine, so that still leaves a little unexplained.
Just wondering if anyone has any ideas.
See the code samples below, and also some screen prints of the source folder structure before and after.
$sLogPath = 'C:\PowerShell_scripts\test\robocopy_test.log'
$StagingFolder = '\\3ipbg-fs-p01\PSP_Inbound\Test\3Shape_Auto_Create\Suspense\test_case_01'
$FinalFolder_M = '\\3ipbg-fs-p01\patient specific mfg\3Shape_AutoCreate\Auto_Job_Create_Test\Inbox_Manual_Jobs'
$FinalFolder_M_robo = '\\3ipbg-fs-p01\patient specific mfg\3Shape_AutoCreate\Auto_Job_Create_Test\Inbox_Manual_Jobs\test_case_01'
Move-Item -LiteralPath $StagingFolder $FinalFolder_M -Verbose -Force
robocopy $StagingFolder $FinalFolder_M_robo /R:3 /W:3 /E /B /J /MIR /MOVE /LOG+:$sLogPath
[Screen prints of the source folder structure before and after]
You are using the /R:3 parameter for robocopy, which means it retries a failed copy up to 3 times before giving up.
I am not aware of such functionality in PowerShell, but you could easily write a small utility function for this.
Assumptions:
1) Inside the loop, the function checks whether the source folder still exists; if it does, it retries, up to $retries times, sleeping 2 seconds between attempts.
Function Custom-Move {
    param( [string]$source, [string]$destination, [int]$retries )
    $i = 0
    while ($i -ne $retries) {
        Move-Item -LiteralPath $source -Destination $destination -Verbose -Force -ErrorAction SilentlyContinue
        # Test whether the source folder still exists; if it does not, the move succeeded and we can stop.
        if (-not (Test-Path $source)) { break }
        $i++
        Start-Sleep 2
    }
}
Usage:
Custom-Move $StagingFolder $FinalFolder_M 5

PowerShell: Multipart RAR extraction

I have a folder with a number of RAR files, some of which are split into parts (.part01, .part02, and so on). The code I am currently using is given below.
$Rars = Get-ChildItem -Path 'c:\demopath' -Filter "*.rar"
$Destination = 'c:\demopath'
$WinRar = "C:\Program Files\WinRAR\WinRAR.exe"
foreach ($rar in $Rars)
{
    & $WinRar x -y $rar.FullName $Destination
    Get-Process winrar | Wait-Process
}
This code runs once per part, so the same archive gets extracted again and again, once for each part it has. For example, if an archive has 3 parts, the same files are extracted 3 times (overwriting the previously extracted files). For a single RAR file there is no issue. If I only pass "x" (without -y), it shows pop-ups saying the file already exists. I need a solution that shows no pop-ups and extracts only if a file with the same name is not already in the directory.
Can someone help me fix this issue?
I solved the issue by using the UnRAR freeware that ships in the WinRAR directory.
All I needed was the "-o-" switch to stop it from overwriting the already-extracted files. The same command works with WinRAR.exe as well:
&$UnRAR x -o- $rar.FullName $Destination
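Putting it together, the whole loop might look like this (a sketch; UnRAR.exe ships in the WinRAR folder, and the trailing backslash tells UnRAR that the last argument is the extraction directory):
$UnRAR       = 'C:\Program Files\WinRAR\UnRAR.exe'
$Destination = 'c:\demopath'
foreach ($rar in Get-ChildItem -Path 'c:\demopath' -Filter '*.rar') {
    # x   : extract with full paths
    # -o- : never overwrite, so re-running over later .partNN volumes extracts nothing and shows no prompt
    & $UnRAR x -o- $rar.FullName "$Destination\"
}
Because UnRAR.exe is a console application, the call blocks until extraction finishes, so the Get-Process winrar | Wait-Process step from the original script is no longer needed.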

PowerShell 5.0 Compress-Archive creates empty file duplicates of some folders

Long story short, I have a PowerShell script which compresses several folders into zip files.
In order to compress a single directory into a zip file, I use this command:
Compress-Archive -Path $SourcePath -DestinationPath $OutputPath -CompressionLevel Optimal
Where $SourcePath is an absolute path ending in *, e.g. C:\Build\Output*, and $OutputPath is an absolute path ending in .zip, e.g. C:\Build\Debug.zip.
There are a lot of files and folders in the source path.
The issue I experience is that, scattered around the zip file, some folders have a duplicate empty file. This causes problems when trying to unzip the archive with, e.g., 7-Zip.
Interestingly enough, I do not see this issue with the built-in unzip in Total Commander.
I am wondering if this is an issue with the Powershell command, or 7-zip?
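One way to narrow it down is to list what Compress-Archive actually wrote into the zip with .NET (a diagnostic sketch; zero-length entries whose names match folder paths would correspond to the duplicate empty files described above):
# Load the ZipFile class (available on Windows PowerShell 5.0 with .NET 4.5+)
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::OpenRead('C:\Build\Debug.zip')
try {
    # FullName is the stored entry path; Length is the uncompressed size
    $zip.Entries | Select-Object FullName, Length | Sort-Object FullName
}
finally {
    $zip.Dispose()
}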

Why won't Shopify accept my theme's zip file when compressed using PowerShell?

I'm trying to create a workflow for designing multiple Shopify themes based on the same theme. I have written a simple PowerShell script that collects the necessary files and compresses them into a zip file.
Here is a simplified version of my script:
# Copy all files from the base theme to my temporary folder
Copy-Item $baseTheme"*" $tempFolder -recurse -force -exclude ".git"
# Include the files specific to the current theme
Copy-Item $specificTheme"assets" $tempFolder -recurse -force
Copy-Item $specificTheme"config" $tempFolder -recurse -force
Copy-Item $specificTheme"layout" $tempFolder -recurse -force
Copy-Item $specificTheme"snippets" $tempFolder -recurse -force
Copy-Item $specificTheme"templates" $tempFolder -recurse -force
# Compress the temporary folder
Compress-Archive $tempFolder $zipFileName
When I manually perform these steps, and create the zip file in Windows using Send To > Compressed (zipped) folder, Shopify is completely happy with the zip file.
However, when I upload the result of this script it gives me the following error:
There was 1 error:
zip does not contain a valid theme: missing template "layout/theme.liquid", missing template "templates/index.liquid", missing template "templates/collection.liquid", missing template "templates/product.liquid", missing template "templates/page.liquid", missing template "templates/cart.liquid", and missing template "templates/blog.liquid"
I've double and triple checked the zip file, and all of the required files exist. I've messed around with the -CompressionLevel of Compress-Archive. I've tried other methods of zipping folders within PowerShell. All with no luck.
I really can't see any difference between the results of the script and compressing manually. Any ideas?
I am answering my own question, but I'm not the one who solved it. I posted a link to this question on the Shopify forum and someone named Adrian posted the following answer:
Short version: download 7-Zip and use the test archive facility. Zips created with Compress-Archive show no folders, whereas those created with the Send To zip in Windows do. The folders do exist inside the actual archive though, so I'd say the header info is malformed.
Looks like there is a structural error within the files that Compress-Archive creates that Windows is happy with but the, probably Unix-based, Shopify servers don't accept.
I installed 7-Zip and used the Test Archive feature to evaluate both archives. Here is the (truncated) output:
Compressed with Compress-Archive
Archives: 1
Files: 77
There are no errors.
Compressed with Send-To Zip context menu
Archives: 1
Folders: 6
Files: 76
There are no errors.
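For reference, the same check can be scripted (assuming 7-Zip's default install path; the 't' command tests an archive and prints the Folders/Files counts shown above):
# Test the archive produced by the build script
& 'C:\Program Files\7-Zip\7z.exe' t $zipFileName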
At this point, I'm just happy knowing what's going on. But in order to really call this problem solved I updated my PowerShell script to use 7-Zip for compressing the folder.
I replaced this:
# Compress the temporary folder
Compress-Archive $tempFolder $zipFileName
With this:
# Compress the temporary folder
& "c:\Program Files\7-Zip\7z.exe" a -tzip $zipFile $tempFolder
I uploaded the archive to Shopify and everything worked just fine.
Now, I'm really not that excited about installing and running third party software to do what (I believe) should be a basic OS task. I've reported this issue to Microsoft here. Who knows, maybe they'll fix it.
This looks like an issue with the folder structure inside the archive file.
If you inspect the archive file created via PowerShell, does it show the same structure as the file created manually?
This link shows the expected folder structure for a Shopify Theme:
index.liquid
product.liquid
collection.liquid
cart.liquid
blog.liquid
article.liquid
page.liquid
list_collections.liquid
search.liquid
404.liquid
gift_cards.liquid
customers/account.liquid
customers/activate.liquid
customers/addresses.liquid
customers/login.liquid
customers/order.liquid
customers/register.liquid
customers/reset_password.liquid
password.liquid
settings_schema.json
theme.liquid (layout file)

PowerShell Compress-Archive file size issue

I am having an issue with the Compress-Archive command in PowerShell. It seems that once the size of the directory it pulls from is over 20 GB or so, it returns an error.
Compress-Archive -Path Z:\from\* -CompressionLevel Optimal -DestinationPath Z:\To\test.zip
If the source folder is under 20 GB in size, this command works fine. If it is greater than 20 GB, I get the following errors:
Remove-Item : Cannot find path 'Z:\To\test.Zip' because it does not exist.
Test-Path : The specified wildcard character pattern is not valid:
Is there a limit on this that is just not documented on the Microsoft site?
Note: I am on Windows 10.
I suggest using PowerShell to call a program that is better suited to zipping large data, such as 7-Zip, WinRAR, or others. You will probably achieve better results with big files.
You could refer to this post for alternatives:
How to create a zip archive with PowerShell?
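For example, calling 7-Zip from PowerShell might look like this (a sketch, assuming 7-Zip's default install location and the paths from the question):
# 7-Zip copes with archives well beyond the sizes where Compress-Archive fails
$sevenZip = 'C:\Program Files\7-Zip\7z.exe'
# 'a' adds files to an archive; -tzip forces the zip format
& $sevenZip a -tzip 'Z:\To\test.zip' 'Z:\from\*'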