I wrote a simple PowerShell script that takes a set of files and compresses them into a zip file using the .NET Framework ZipFile class. What I'd like to do is verify that the files compressed without issues.
I can create a hash value for the zip file itself, but I'm unsure how to do this for each individual file in the archive, or how to compare each uncompressed file to the compressed version of that file. Here's the compression piece of my script.
$FileList | ForEach-Object -Begin {
    # Open (create) the destination archive once, before any files are processed
    $Zip = [System.IO.Compression.ZipFile]::Open("Destination.Zip", "Create")
} -Process {
    # Add each file to the archive under its own name, using optimal compression
    [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($Zip, $_.FullName, $_.Name, "Optimal")
} -End {
    # Release the archive handle so the zip is flushed to disk
    $Zip.Dispose()
}
I know the compression piece works; however, the eventual goal is to verify each file and redo that file if verification fails, then delete the uncompressed files afterwards.
The system this will run on only has PowerShell v3, and no third-party compression tools are installed. I'd like to stick with that if possible.
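To illustrate the kind of per-file check I have in mind, here's a rough sketch of my current thinking (just a sketch, not something I've settled on; the Add-Type line can be dropped if the assembly is already loaded earlier in the script): hash each source file and the matching archive entry, then compare the two.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$sha = [System.Security.Cryptography.SHA256]::Create()
$zip = [System.IO.Compression.ZipFile]::OpenRead("Destination.Zip")
try {
    foreach ($file in $FileList) {
        # Hash the original, uncompressed file on disk
        $fileStream = [System.IO.File]::OpenRead($file.FullName)
        $sourceHash = [System.BitConverter]::ToString($sha.ComputeHash($fileStream))
        $fileStream.Close()

        # Hash the decompressed stream of the matching archive entry
        $entryStream = $zip.GetEntry($file.Name).Open()
        $entryHash = [System.BitConverter]::ToString($sha.ComputeHash($entryStream))
        $entryStream.Close()

        if ($sourceHash -ne $entryHash) {
            Write-Warning "$($file.Name) did not verify and needs to be re-added"
        }
    }
}
finally {
    $zip.Dispose()
}
I'm not sure this is the right approach, which is why I'm asking.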
I guess your most direct option would be to use 7-Zip.
You can run native (external) commands directly from PowerShell.
You can find an example use of the t (Test) command here.
Get the CLI, add the 7-Zip folder to your PATH, and run, for example:
7z t archive.zip
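If you want to drive it from your script and redo the archive when the test fails, you could check the exit code afterwards; a rough sketch (7z returns 0 when every entry passes, and the archive name is taken from your snippet):
7z t "Destination.Zip"        # runs a CRC test on every entry in the archive
if ($LASTEXITCODE -ne 0) {
    Write-Warning "Archive verification reported warnings or errors"
}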
I am trying to execute an .exe file (let's say it is called myfile.exe) with an argument file (argument.fst). Both files have the same name for each execution, but are located in different subfolders of the same parent directory.
My objective is to create a for-loop in which I pinpoint the paths to both files (14 groups in total, so 14 loops) and then have Windows PowerShell execute them. My goal is to automate my simulations, which are run by the .exe files plus arguments, and save time.
Is this possible to implement in Windows PowerShell?
Thank you very much,
Ioannis Voultsos.
If you want to automate the process, you may store your command/argument pairs in a CSV file (e.g. commands.csv):
command;arguments
myapp.exe;c:/
myapp.exe;h:/
then load it and execute using &:
$csv = Import-Csv commands.csv -Delimiter ';'
$csv | ForEach-Object { & $_.command $_.arguments }
Beware of executing commands from strings that come from untrusted sources, though.
Try out this sample code from the parent folder:
Get-ChildItem | Where-Object { $_.PSIsContainer } | ForEach-Object { cd $_.FullName; & ".\SampleApp.exe" args0 args1; cd .. }
It will go into each subdirectory and execute the .exe there with the given arguments. Note that the executable path and its arguments have to be passed to & separately, not as a single quoted string.
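Adapted to your layout, a sketch might look like this (the parent path is an assumption, and it presumes each of the 14 subfolders contains both myfile.exe and argument.fst):
$parent = "C:\Simulations"    # assumed parent directory; adjust to yours
Get-ChildItem $parent | Where-Object { $_.PSIsContainer } | ForEach-Object {
    $exe = Join-Path $_.FullName "myfile.exe"
    $arg = Join-Path $_.FullName "argument.fst"
    & $exe $arg               # run each simulation in turn
}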
Pretty simple question I guess...
For this example, I have a directory with 3 files called L1.rph, L2.rph, and L3.rph and one executable called convert.exe
If I manually drag and drop each individual filename.rph file onto the executable, it creates a filename.csv; however, if I select more than one, it will only convert one.
I know there has got to be a way to write a for loop that "emulates" me dragging and dropping all of the .rph files in that directory onto the executable, creating all the .csv files that I need.
Sorry, I'm a newbie with scripts; it probably would have been easier for me in a Linux shell, but I have this exe on Windows, so I'm stuck.
I need this to run in Windows PowerShell.
You can pass those files as arguments to the exe:
Get-ChildItem "Path" -Filter *.rph | ForEach-Object { & "exefilepath" $_.FullName }
Drag and drop is just the same as passing the dropped file's path as an argument.
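Plugged into the names from your question (the directory path is only a placeholder), that would look something like:
$dir = "C:\Data"    # assumed folder holding L1.rph, L2.rph, L3.rph and convert.exe
Get-ChildItem $dir -Filter *.rph | ForEach-Object {
    & (Join-Path $dir "convert.exe") $_.FullName    # each run should produce filename.csv
}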
I'm writing a PowerShell code to copy files in a folder to another folder. I want the console to display the files that are being copied, as well as the file size if possible.
Currently, I have tried using -Verbose, but the output is not very readable.
I would like the console to display the files being copied, and the file size.
You can use the -PassThru parameter of Copy-Item, but it will not show you the file size.
I would recommend using robocopy.exe for any copy jobs in PowerShell. It is more reliable, and in your case it will show you the file size.
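A sketch of what that looks like (both paths are placeholders); robocopy lists each copied file together with its size in the console output:
# /E copies subdirectories (including empty ones), /NP suppresses the per-file percentage ticker
robocopy "C:\SourceFolder" "C:\DestinationFolder" /E /NP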
In a PowerShell script, I've created zip archives using methods like
[IO.Compression.ZipFile]::CreateFromDirectory.
Now these archives are getting large, and I need to break them down into files that are under 5 GB. So I've been looking through some of the MS API documentation on file compression for some kind of disk-spanning feature, or a way to spread an archive across multiple files.
Does anyone know of a .NET method or PowerShell cmdlet that can do this?
Thanks!
I guess you've already read about the file size limitation:
powershell compress-archive File size issue
zip file size in powershell
Compress-Archive
Because Compress-Archive relies upon the Microsoft .NET Framework API System.IO.Compression.ZipArchive to compress files, the maximum file size that you can compress by using Compress-Archive is currently 2 GB. This is a limitation of the underlying API.
Maybe you could use 7-Zip's volume option, e.g. -v4g for 4 GB volumes.
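A sketch of what that could look like with the 7-Zip CLI (assuming 7z.exe is on the PATH; the source path is a placeholder):
# Produces Destination.zip.001, Destination.zip.002, ... each volume at most 4 GB
7z a -tzip -v4g "Destination.zip" "C:\FolderToArchive\*"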
In my PowerShell script I'm unzipping a "filename.zip" file; once that's done, I want to cross-check that the number of files I unzipped and placed in a folder equals the number of files in "filename.zip". Is there any way to do that? Sometimes a few files get missed.
Note: I'm using PowerShell version 2.0.
$zip_file="D:\speter071714\OMS.DODU.Qa.$version_no.zip"
$DODU_folder="D:\speter071714\DODU\"
echo "Extracting the DODU.zip file"
$shell = New-Object -ComObject shell.application
$zip1=$shell.NameSpace("$zip_file")
foreach ($item1 in $zip1.items())
{
$shell.Namespace("$DODU_folder").CopyHere($item1)
}
I basically want to compare the contents of $zip_file ("D:\speter071714\OMS.DODU.Qa.$version_no.zip")
and $DODU_folder ("D:\speter071714\DODU\").
If a few files are missing, it means that your unzipping method is not very reliable. See whether another method works better for you: How to read contents of a csv file inside zip file using PowerShell
But regarding your approach: you already have all the items inside the zip file in $zip1.items(), so you could check the Name property of each item and then see whether that file exists in the output folder, as in the sketch below.
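A sketch of that check, reusing the variables from your snippet (it only looks at the top-level items returned by $zip1.items()):
$missing = @()
foreach ($item1 in $zip1.items())
{
    # Does an item with the same name exist in the extraction folder?
    if (-not (Test-Path (Join-Path $DODU_folder $item1.Name))) { $missing += $item1.Name }
}
if ($missing.Count -eq 0) {
    echo "All $($zip1.items().Count) items were extracted."
} else {
    echo "Missing items: $($missing -join ', ')"
}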