In a Powershell script, I've created zip file archives using functions like
[io.compression.zipfile]::CreateFromDirectory.
Now these archives are getting large, and I need to break them into files that are under 5 GB each. So I've been looking through some of Microsoft's API documentation on file compression for some kind of disk-spanning feature, or a way to spread an archive across multiple files.
Does anyone know of a .Net or Powershell cmdlet that can do this?
Thanks!
I guess you've already read about the file size limitation:
powershell compress-archive File size issue
zip file size in powershell
Compress-Archive
Because Compress-Archive relies upon the Microsoft .NET Framework API System.IO.Compression.ZipArchive to compress files, the maximum file size that you can compress by using Compress-Archive is currently 2 GB. This is a limitation of the underlying API.
Maybe you could use 7-Zip's volume option, e.g. -v4g for 4 GB volumes.
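A minimal sketch of what that could look like, assuming 7z.exe is installed and on your PATH (the paths here are illustrative):

```powershell
# Assumes 7z.exe is on PATH and the source folder exists (illustrative paths).
# -v4g splits the output into 4 GB volumes: archive.zip.001, archive.zip.002, ...
& 7z a -tzip -v4g 'D:\Backups\archive.zip' 'D:\DataToArchive\*'
```

Note that split volumes generally need to be reassembled by 7-Zip itself (extract the .001 file); standard zip extractors may not handle them directly.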
Related
I'm writing a PowerShell code to copy files in a folder to another folder. I want the console to display the files that are being copied, as well as the file size if possible.
Currently, I have tried using -Verbose, but the output is not very readable.
I would like the console to display the files being copied, and the file size.
You can use the -PassThru parameter with Copy-Item, but it will not show you the file size.
I would recommend using robocopy.exe for any copy jobs in PowerShell. It is more reliable, and in your case it will show you the file size.
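A minimal sketch (folder paths are illustrative); robocopy lists each file as it is copied along with its size:

```powershell
# Illustrative paths. /E copies subdirectories including empty ones,
# /BYTES shows sizes in bytes, /NP suppresses the per-file progress
# percentage so the output stays readable.
robocopy 'C:\Source' 'C:\Destination' /E /BYTES /NP
```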
Running the following command to zip all txt files:
Compress-Archive -Path "$testfolder\*.txt" -CompressionLevel Optimal -DestinationPath $testfolder\TESTZIP
I created a scheduled task that will run every 5 minutes for a period of 1 hour. Since this is a test, files get created every 5 minutes as well. But my zip folder does not get updated.
How could I update my zip folder based on my command on top?
After 1 hour, email alerts get sent out. I have the email settings set up.
When in doubt, read the documentation.
-Update
Updates the specified archive by replacing older versions of files in the archive with newer versions of files that have the same names. You can also add this parameter to add files to an existing archive.
Add the -Update parameter to your command line.
To overwrite an existing zip file use the -Force argument.
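Applied to the command in the question, that might look like this (variable name taken from the question; the .zip extension is added explicitly for clarity):

```powershell
# -Update replaces older entries in the archive and adds any new files.
Compress-Archive -Path "$testfolder\*.txt" -CompressionLevel Optimal `
    -DestinationPath "$testfolder\TESTZIP.zip" -Update
```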
I have a script that I run at my work that uses get-childitem to get all the files of a certain type in a storage drive and sorts and moves them to an archive drive. I'd like to automate this process to run once everyday but I realized I would have a problem in doing so.
Occasionally, when this script is run a file or two will still be in the process of transferring over to our storage drive. If I let the script move this file while it is still being transferred from our customer, it gets corrupted and won't open later.
I know how to filter based on file type and date and other basic parameters, but I'm not entirely sure how I tell this script to exclude files that are currently growing in size.
Below is what I'm currently using to filter what I want to move:
$TargetType = "*.SomeFileType"
$TargetDrive = "\\Some\UNC\Path"
Get-ChildItem $TargetDrive\$TargetType | ForEach-Object { $_.FullName } | Sort-Object | Out-File $outStorageMove
Also, at the moment I'm putting everything that get-childitem finds into a text file, that gets invoked later so that I can manually edit what I want it to move. I'd like to get rid of this step if at all possible.
So, move is essentially copy and delete.
As gvee stated, Copy-Item is a better option; to get past your stated concern, monitor for the copy to complete. My addition would be to delete only once the copy is done and you have verified it.
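One common way to skip files that are still being written (a sketch, not something from the question) is to try opening each file exclusively and treat a failure as "still in use":

```powershell
# Sketch: returns $true if the file cannot be opened with exclusive access,
# which usually means another process is still writing to it.
function Test-FileLocked {
    param([string]$Path)
    try {
        $stream = [System.IO.File]::Open($Path, 'Open', 'ReadWrite', 'None')
        $stream.Close()
        return $false
    } catch {
        return $true
    }
}
```

You could then filter the Get-ChildItem output with Where-Object { -not (Test-FileLocked $_.FullName) } before moving anything.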
Or use BITS as a job to do this.
Using Windows PowerShell to Create BITS Transfer Jobs
https://msdn.microsoft.com/en-us/library/windows/desktop/ee663885(v=vs.85).aspx
You can use PowerShell cmdlets to create synchronous and asynchronous Background Intelligent Transfer Service (BITS) transfer jobs.
All of the examples in this topic use the Start-BitsTransfer cmdlet. To use the cmdlet, be sure to import the module first. To install the module, run the following command: Import-Module BitsTransfer. For more information, type Get-Help Start-BitsTransfer at the PowerShell prompt.
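A minimal Start-BitsTransfer sketch (source and destination paths are illustrative):

```powershell
Import-Module BitsTransfer
# Synchronous transfer: the cmdlet returns only when the copy has fully
# completed, which avoids acting on a file that is still in flight.
Start-BitsTransfer -Source '\\Some\UNC\Path\file.dat' -Destination 'D:\Archive\file.dat'
```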
I am attempting to create a zip file from a folder using PowerShell 5 on Windows 10. After looking at this Stack Overflow post, I am trying out the Compress-Archive method.
When I Type in:
Compress-Archive -Path 'C:\Users\Test\demo' -DestinationPath 'C:\Users\Test\demo.zip' -verbose
I get a verbose error saying:
VERBOSE: The partially created archive file 'C:\Users\Test\demo.zip' is deleted as it is not usable.
I've looked everywhere online and I'm not able to find a solution to this. Anyone know what's going on?
This message can appear if the folder you are compressing is empty.
I had this same problem. My path contained a trailing wildcard, like d:\somedirectory\*; I removed the * and it worked when the path was just d:\somedirectory.
So my take is that you need to be extra careful when it comes to the path.
I wrote a simple powershell script that takes a set of files and compresses them into a zip file using the ZipFile .NET Framework class. What I'd like to do is to verify that the file compressed without issues.
I can create a hash value for the zip file itself, but I'm unsure how to do this for each individual file in the archive, or how to compare each uncompressed file to its compressed version. Here's the compression piece of my script.
$FileList | ForEach-Object -Begin {
    # On PowerShell v3, load the compression assemblies first:
    # Add-Type -AssemblyName System.IO.Compression.FileSystem
    $Zip = [System.IO.Compression.ZipFile]::Open("Destination.Zip", "Create")
} -Process {
    [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($Zip, $_.FullName, $_.Name, "Optimal")
} -End {
    $Zip.Dispose()
}
I know the compression piece works, however the eventual goal is to verify each file and redo that file if the verification fails. Afterwards delete the uncompressed files.
The system this is to run on only has powershell v3 and no third party compression tools are installed. I'd like to stick with that if possible.
I guess your most direct shot would be to use 7-Zip.
You can use native commands in PowerShell.
You can find an example use of the -t switch here.
Get the CLI, add the 7-Zip folder to your path, and run, for example:
7z t archive.zip
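Since the question rules out third-party tools, here is a hedged alternative sketch using only .NET types, which should work on PowerShell v3 with .NET 4.5. It assumes the original uncompressed files are still in the current directory under the same names as the archive entries ("Destination.Zip" is taken from the question's script):

```powershell
# Sketch: compare a SHA-256 hash of each archive entry's decompressed stream
# against a hash of the corresponding original file.
Add-Type -AssemblyName System.IO.Compression.FileSystem
$sha = [System.Security.Cryptography.SHA256]::Create()
$zip = [System.IO.Compression.ZipFile]::OpenRead('Destination.Zip')
try {
    foreach ($entry in $zip.Entries) {
        $stream = $entry.Open()
        try { $entryHash = [BitConverter]::ToString($sha.ComputeHash($stream)) }
        finally { $stream.Close() }
        $fileHash = [BitConverter]::ToString(
            $sha.ComputeHash([System.IO.File]::ReadAllBytes($entry.Name)))
        if ($entryHash -ne $fileHash) { Write-Warning "Verification failed: $($entry.Name)" }
    }
} finally {
    $zip.Dispose()
}
```

Files that fail the comparison could then be re-added to the archive, and only verified originals deleted.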