PowerShell Compress-Archive file size issue

I am having an issue with the Compress-Archive cmdlet in PowerShell. It seems that once the source directory is over roughly 20 GB, the command returns an error.
Compress-Archive -Path Z:\from\* -CompressionLevel Optimal -DestinationPath Z:\To\test.zip
If the from folder is under 20 GB, this command works fine. If it is greater than 20 GB, I get the following error:
Remove-Item : Cannot find path 'Z:\To\test.Zip' because it does not
exist.
Test-Path : The specified wildcard character pattern is not valid:
Is there a limit on this that is simply not documented on Microsoft's site?
Note: I am on Windows 10

I suggest using PowerShell to call a more capable archiver, such as 7-Zip or WinRAR; you will likely get better results with large files. (For what it's worth, the Compress-Archive documentation notes that the cmdlet is built on the .NET System.IO.Compression.ZipArchive API, which caps individual files at 2 GB.)
You could refer to this post for alternatives:
How to create a zip archive with PowerShell?
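As a sketch of that suggestion (assuming 7-Zip is installed at its default path; adjust $sevenZip otherwise — the paths here reuse the question's examples):

```powershell
# Hypothetical location for 7z.exe; change to match your installation.
$sevenZip = 'C:\Program Files\7-Zip\7z.exe'

# 'a' = add to archive, -tzip = force zip output; 7-Zip does not have the
# 2 GB per-file cap that Compress-Archive inherits from System.IO.Compression.
& $sevenZip a -tzip 'Z:\To\test.zip' 'Z:\from\*'

# 7z.exe reports success or failure through its exit code.
if ($LASTEXITCODE -ne 0) { throw "7-Zip failed with exit code $LASTEXITCODE" }
```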

Related

How to use Powershell Compress-Archive to compress only pdfs in a folder?

I'm working on an Alteryx workflow which creates multiple pdfs and outputs them to a directory. I need to call a Powershell script at the end of the workflow to compress the pdfs into a zip file and save it in the same location.
I found this:
PowerShell Compress-Archive By File Extension, but it requires copying the files to another location and then compressing them. Is there a way I can just compress all pdfs in a certain folder location and output the zip file in the same location? Due to restrictions in Alteryx I might not be able to work with the output if it is in a different location.
My current Powershell script:
Compress-Archive -LiteralPath 'D:\temp\test zipping flow\' -DestinationPath 'D:\temp\final.zip' -Force
This works perfectly, but it tries to zip files with other extensions as well, and I only want .pdf files.
Any suggestions are greatly appreciated.
Instead of -LiteralPath, use -Path so you can add the wildcard character *:
Compress-Archive -Path 'D:\temp\test zipping flow\*.pdf' -DestinationPath 'D:\temp\final.zip' -Force
As an alternative for when you need to use -LiteralPath (perhaps because the path contains characters, such as square brackets, that -Path would interpret), you could do this instead:
Get-ChildItem -LiteralPath 'D:\temp\test zipping flow' -Filter '*.pdf' -File |
Compress-Archive -DestinationPath 'D:\temp\final.zip' -Force
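If the pdfs may also live in subfolders, the same pipeline extends with -Recurse (a sketch reusing the question's paths). Note that pipeline input to Compress-Archive places every file at the root of the archive, so duplicate file names across subfolders will collide:

```powershell
# Collect .pdf files from the folder and all subfolders, then zip them.
Get-ChildItem -LiteralPath 'D:\temp\test zipping flow' -Filter '*.pdf' -File -Recurse |
    Compress-Archive -DestinationPath 'D:\temp\final.zip' -Force
```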

Copy-Item command of powershell is not working for large files

I am using Copy-Item of powershell to copy files from source to destination. Below is the command which I am using.
Copy-Item -Path $fpath -Destination D:\abc\copy_location
$fpath being $fpath = $Event.SourceEventArgs.FullPath
While testing I found that it could copy small files, but it could not copy large files (~300-400 MB). The largest files I have to copy are around 400 MB. I saw a Stack Overflow post, "Copy-Item fails on large file", which suggested using double quotes in the path; I tried that as well, with no success.
Please advise what to do. The other option, as I understand it, is the robocopy command:
robocopy source destination file_to_copy
Here I am facing one issue: my source is $fpath, but that gives me the full path up to the file, and I only want the path up to the folder.
I still hit the same problem a few years later, copying large files from Linux to Windows using PowerShell. Smaller files from the same folder were copied successfully, but larger files failed. I found a solution at the end of this issue: https://github.com/powershell/powershell/issues/4952
Basically, you need to set MaxEnvelopeSizeKb to a bigger value; 1000 worked for me for files over 6 MB.
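A sketch of both pieces: raising the WinRM envelope size (run from an elevated prompt; the exact value is a judgment call for your file sizes), and, for the robocopy part of the question above, trimming the file name off $fpath with Split-Path:

```powershell
# Raise the WinRM envelope size. This matters when file contents travel over a
# PowerShell remoting session (e.g. Copy-Item -ToSession / -FromSession).
Set-Item -Path WSMan:\localhost\MaxEnvelopeSizeKb -Value 2048

# Split a full file path into the folder and file-name parts for robocopy.
$fpath  = 'D:\watched\folder\file.bin'      # stand-in for $Event.SourceEventArgs.FullPath
$srcDir = Split-Path -Path $fpath -Parent   # D:\watched\folder
$name   = Split-Path -Path $fpath -Leaf     # file.bin
robocopy $srcDir 'D:\abc\copy_location' $name
```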

Can I use PowerShell `Expand-Archive` upon a zip file with no extension

I'm working on a PowerShell script where I have to extract the contents of a .zip archive whose extension has been removed; that is, the archive's name is, say, not test.zip but just test, although it is compressed as a .zip archive.
I'm trying to use the PowerShell cmdlet Expand-Archive for this purpose, as shown below:
Expand-Archive -LiteralPath "Path to the archive" -DestinationPath "Extraction Path"
But it doesn't seem to work. Is it possible to extract this archive's contents with PowerShell, or would it be better to use a workaround such as the 7-Zip command-line tools?
The Expand-Archive cmdlet is designed to explicitly work with a path that has a .zip extension. You can work around this by either creating a copy of your archive with a proper extension using Copy-Item or renaming the archive to have an extension with Rename-Item (using Move-Item may be more desirable if the archive with extension already exists and you want to overwrite it; Rename-Item is not capable of overwriting).
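A minimal sketch of the copy-then-expand approach (the paths are placeholders):

```powershell
$archive = 'C:\downloads\test'            # the extensionless zip (assumed path)
$tempZip = "$archive.zip"

# Work on a copy so the original stays untouched; Rename-Item would also work.
Copy-Item -LiteralPath $archive -Destination $tempZip
Expand-Archive -LiteralPath $tempZip -DestinationPath 'C:\downloads\test-extracted'
Remove-Item -LiteralPath $tempZip         # discard the temporary copy
```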

Powershell 5.0 Compress-archive creates empty file duplicates of some folders

Long story short, I have a PowerShell script that compresses several folders into zip files.
In order to compress a single directory into a zip file, I use this command:
Compress-Archive -Path $SourcePath -DestinationPath $OutputPath -CompressionLevel Optimal
Where $SourcePath is an absolute path ending in *, e.g. C:\Build\Output*, and $OutputPath is an absolute path ending in .zip, e.g. C:\Build\Debug.zip.
There are a lot of files and folders in the source path.
The issue I experience is that, scattered around the zip file, folders have a duplicate empty file. This causes problems when trying to unzip the archive with e.g. 7-Zip.
Interestingly enough, I do not see this issue with the built-in unzip in Total Commander.
I am wondering: is this an issue with the PowerShell command, or with 7-Zip?
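One way to narrow this down is to build the same archive with the underlying .NET API instead of Compress-Archive and check whether 7-Zip still reports duplicates; a sketch using the question's example paths:

```powershell
# Load the ZipFile type (needed in Windows PowerShell 5.x).
Add-Type -AssemblyName System.IO.Compression.FileSystem

# Archive the directory's contents; the final $false means "do not wrap them
# in a root folder", roughly matching the trailing-* Compress-Archive call above.
[System.IO.Compression.ZipFile]::CreateFromDirectory(
    'C:\Build\Output',
    'C:\Build\Debug.zip',
    [System.IO.Compression.CompressionLevel]::Optimal,
    $false)
```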

Powershell: 'The fully qualified file name must be less than 260 characters'

I tried to use the PowerShell command Copy-Item, like xcopy, to copy the contents of one disk to another:
copy-item -Path h:\* -Destination g:\ -Recurse -Force
However, I encountered the following errors:
Copy-Item : The specified path, file name, or both are too long. The
fully qualified file name must be less than 260 characters, and the
directory name must be less than 248 characters.
I got enough of these errors to be discouraged from manually searching for and copying the files or folders with long paths. What is the best way to avoid this problem?
As far as I know, robocopy deals with this automatically (at the least, you would have to explicitly disable its support for long paths). So you could use
robocopy h:\ g:\ /E
if you're not too averse to a native command instead of a pure PowerShell solution.
Usually you can prepend \\?\ to a path to allow handling paths of up to 32K characters, but it may well be that this does not help with .NET.
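For illustration, this is what the \\?\ form looks like; as noted, .NET-based cmdlets in Windows PowerShell may reject it even though native tools accept it (the paths are hypothetical):

```powershell
# Extended-length path syntax; bypasses the 260-character MAX_PATH limit
# for APIs that support it.
$src = '\\?\H:\some\very\deep\folder\file.txt'
$dst = '\\?\G:\backup\file.txt'
Copy-Item -LiteralPath $src -Destination $dst
```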
On CodePlex, Microsoft hosts an experimental long-path wrapper, which provides functionality to make it easier to work with paths longer than the 259-character limit of the System.IO namespace.
An example of how to copy a file using this wrapper in Powershell:
[Reflection.Assembly]::LoadFile("C:\Users\stackoverflow\Desktop\Microsoft.Experimental.IO.dll")
[microsoft.experimental.io.longpathfile]::Copy((gi .\myversion.txt).fullname, "C:\users\stackoverflow\desktop\aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa",$true)
Other samples can be found here.