Add-Type -A System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::CreateFromDirectory('foo', 'foo.zip')
[IO.Compression.ZipFile]::ExtractToDirectory('foo.zip', 'bar')
I found the code to create and extract .zip files via PowerShell from this answer, but because of my low reputation I cannot ask a question as a comment on that answer.
Creating - How to overwrite an existing .zip file without user interaction?
Extracting - How to overwrite existing files and folders without user interaction? (Preferably like robocopy's /MIR option.)
PowerShell 5 and above has built-in cmdlets for .zip archives, so there is no need to call the .NET class methods. The Compress-Archive -Path parameter also accepts a string[], so you can zip multiple folders/files into the destination zip.
Zipping:
Compress-Archive -Path C:\Foo -DestinationPath C:\Foo.zip -CompressionLevel Optimal -Force
There is also an -Update switch.
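For example, a quick sketch (the extra paths are just placeholders) that updates an existing archive with several sources at once:
# Add more folders/files to an existing archive (example paths)
Compress-Archive -Path C:\Foo, C:\Bar\baz.txt -Update -DestinationPath C:\Foo.zip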
Unzipping:
Expand-Archive -Path C:\Foo.zip -DestinationPath C:\Foo -Force
PowerShell versions prior to 5 can use this script instead.
Thanks to @Ola-M and @maximilian-burszley for the updates.
function Unzip($zipfile, $outdir)
{
    Add-Type -AssemblyName System.IO.Compression.FileSystem
    $archive = [System.IO.Compression.ZipFile]::OpenRead($zipfile)
    try
    {
        foreach ($entry in $archive.Entries)
        {
            $entryTargetFilePath = [System.IO.Path]::Combine($outdir, $entry.FullName)
            $entryDir = [System.IO.Path]::GetDirectoryName($entryTargetFilePath)

            # Ensure the directory of the archive entry exists
            if (!(Test-Path $entryDir)) {
                New-Item -ItemType Directory -Path $entryDir | Out-Null
            }

            # If the entry is not a directory entry, extract it
            # (directory entries end with a path separator and have no content)
            if (!$entryTargetFilePath.EndsWith("\") -and !$entryTargetFilePath.EndsWith("/")) {
                [System.IO.Compression.ZipFileExtensions]::ExtractToFile($entry, $entryTargetFilePath, $true)
            }
        }
    }
    finally
    {
        $archive.Dispose()
    }
}

Unzip -zipfile "$zip" -outdir "$dir"
I would like to zip a path while a Windows service is running inside it.
When the service is stopped, it works perfectly; when the service is running, I get the exception:
The process cannot access the file because it is being used by another process.
However, when I zip with 7-zip, I don't have any exception.
My command:
Compress-Archive [PATH] -CompressionLevel Optimal -DestinationPath("[DEST_PATH]") -Force
Do you have any idea to perform the task without this exception?
Copy-Item allows you to access files that are being used in another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass all the copied items through the pipeline instead of copying them to a specific destination first and then compressing.
I like to zip up a folder's content rather than the folder itself, which is why I use Get-ChildItem before compressing in the last line.
Sub-folders are already included, so there is no need to use -Recurse in the first line.
A good method to access files being used by another process is by creating snapshots using Volume Shadow Copy Service.
To do so, one can simply use PowerShell's WMI cmdlets:
$Path = "C:\my\used\folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
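For example, a minimal sketch (the destination path is just a placeholder, and the snapshot is removed afterwards with Remove-WmiObject):
Compress-Archive -Path $snapshotPath -DestinationPath "C:\Backups\used-folder.zip" -Force
# Remove the snapshot once it is no longer needed
$shadowCopy | Remove-WmiObject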
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy backed up files, or to compress them without those Access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
I had a similar requirement where only a few file extensions needed to be added to the zip.
With this approach, you copy all the files, including locked ones, to a temporary location, zip them, and then delete the copies.
It is a somewhat lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddHHmmss
$zipfile = 'C:\Logs\logfiles' + $filedate + '.zip'
# Create the staging folder (ignore the error if it already exists)
New-Item -Path "C:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
# Copy the log files (including locked ones) to the staging folder
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
# Zip the copies, then remove everything except the zip
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force
I have a simple folder structure that I need to zip up to upload to an AWS Lambda function:
node-modules/
index.js
package.json
The files above are in the root directory. Basically, the end goal is that I would like a zip of all of these files and sub-files/directories instead of the structure as it is.
When I try to run the command below, it says it can't access E://apps/myapp/release.zip because it is being used by another process. However, I can see that it starts to create release.zip, but it does not have all the contents.
Add-Type -Assembly "System.IO.Compression.FileSystem";
[System.IO.Compression.ZipFile]::CreateFromDirectory('E://apps/myapp', 'E://apps/myapp/release.zip');
So I tried another approach. Take the folder and two files and copy them into a temporary folder, then try to zip it back into the root.
Copy-Item E://apps/myapp E://apps/myapp/temp -recurse
I do see a temp/ folder, but within the temp folder it's like an inception of never-ending copies until the file path gets too long.
Any tips would be much appreciated.
The issue could be that you are creating the zip file inside the same folder that you are trying to compress. You basically did the same thing when you tried using the temporary folder, hence the inception.
Try creating the destination outside of the source folder being compressed.
$source = "E://apps/myapp"
$destination = "E://apps/myapp.release.zip"

# If the archive already exists, CreateFromDirectory throws an IOException
if (Test-Path $destination) {
    Remove-Item -Path $destination -Force -Confirm:$false
}

[Reflection.Assembly]::LoadWithPartialName("System.IO.Compression.FileSystem")
[System.IO.Compression.ZipFile]::CreateFromDirectory($source, $destination)

# Once the archive is created, delete the source folder
if (Test-Path $destination) {
    Remove-Item -Path $source -Force -Recurse -Confirm:$false
    Write-Host "directory removed: $source"
}
I am trying to copy/move one text file into a zip. I don't want to unzip it, copy the file, and zip it back up. Is there any way to directly copy or move a text file into a zip in PowerShell? When I do it in PowerShell, the copy seems to complete, but when I then try to look inside the zip it says the path is invalid.
PowerShell commands:
$A = "20160914.4"
New-Item "C:\test\VersionLabel.txt" -ItemType file
$A | Set-Content "C:\test\VersionLabel.txt"
Copy-Item "C:\test\VersionLabel.txt" "C:\test\Abc.zip" -Force
Error: The compressed folder is invalid
>= PowerShell 5.0
Per @SonnyPuijk's answer above, use Compress-Archive.
clear-host
[string]$zipFN = 'c:\temp\myZipFile.zip'
[string]$fileToZip = 'c:\temp\myTestFile.dat'
Compress-Archive -Path $fileToZip -Update -DestinationPath $zipFN
< PowerShell 5.0
To add a single file to an existing zip:
clear-host
Add-Type -assembly 'System.IO.Compression'
Add-Type -assembly 'System.IO.Compression.FileSystem'
[string]$zipFN = 'c:\temp\myZipFile.zip'
[string]$fileToZip = 'c:\temp\myTestFile.dat'
[System.IO.Compression.ZipArchive]$ZipFile = [System.IO.Compression.ZipFile]::Open($zipFN, ([System.IO.Compression.ZipArchiveMode]::Update))
[System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($ZipFile, $fileToZip, (Split-Path $fileToZip -Leaf))
$ZipFile.Dispose()
To create a single file zip file from scratch:
Same as above, only replace: [System.IO.Compression.ZipArchiveMode]::Update
With: [System.IO.Compression.ZipArchiveMode]::Create
Related Documentation:
Compress-Archive: https://technet.microsoft.com/en-us/library/dn841358.aspx
CreateEntryFromFile: https://msdn.microsoft.com/en-us/library/hh485720(v=vs.110).aspx
You can use Compress-Archive for this. Copy-Item doesn't support zip files.
If you don't have PowerShell v5, you can use either the 7-Zip command line or .NET.
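For example, a minimal 7-Zip sketch (assuming 7z.exe is installed at its default location; adjust the paths to your setup), where a means "add to archive":
# Add the text file to the existing zip without extracting it first
& "C:\Program Files\7-Zip\7z.exe" a "C:\test\Abc.zip" "C:\test\VersionLabel.txt"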
I'm trying to copy a file to a new location, maintaining directory structure.
$source = "c:\some\path\to\a\file.txt"
$destination = "c:\a\more\different\path\to\the\file.txt"
Copy-Item $source $destination -Force -Recurse
But I get a DirectoryNotFoundException:
Copy-Item : Could not find a part of the path 'c:\a\more\different\path\to\the\file.txt'
The -recurse option only creates a destination folder structure if the source is a directory. When the source is a file, Copy-Item expects the destination to be a file or directory that already exists. Here are a couple ways you can work around that.
Option 1: Copy directories instead of files
$source = "c:\some\path\to\a\dir"; $destination = "c:\a\different\dir"
# No -force is required here, -recurse alone will do
Copy-Item $source $destination -Recurse
Option 2: 'Touch' the file first and then overwrite it
$source = "c:\some\path\to\a\file.txt"; $destination = "c:\a\different\file.txt"
# Create the folder structure and empty destination file, similar to
# the Unix 'touch' command
New-Item -ItemType File -Path $destination -Force
Copy-Item $source $destination -Force
Alternatively, from PS 3.0 onwards, you can simply use New-Item to create the target folder directly, without having to create a "dummy" file, e.g. ...
New-Item -Type dir \\target\1\2\3\4\5
...will happily create the \\target\1\2\3\4\5 structure irrespective of how much of it already exists.
Here's a one-liner to do this. Split-Path retrieves the parent folder, New-Item creates it, and then Copy-Item copies the file. Please note that the destination file will have the same file name as the source file. Also, this won't work if you need to copy multiple files to the same folder: for the second file you'll get an "An item with the specified name <destination directory name> already exists" error.
Copy-Item $source -Destination (New-Item -Path (Split-Path -Path $destination) -Type Directory)
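A slight variant (a sketch under the same assumptions) avoids that error: adding -Force makes New-Item return the existing folder instead of throwing, so the one-liner also works when the target folder is already there:
Copy-Item $source -Destination (New-Item -Path (Split-Path -Path $destination) -Type Directory -Force)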
I had files in a single folder in Windows 7 that I wanted to rename and copy to nonexistent folders.
I used the following PowerShell script, which defines a Copy-New-Item function as a wrapper for the Test-Path, New-Item, and Copy-Item cmdlets:
function Copy-New-Item {
    $SourceFilePath = $args[0]
    $DestinationFilePath = $args[1]

    If (-not (Test-Path $DestinationFilePath)) {
        New-Item -ItemType File -Path $DestinationFilePath -Force
    }
    Copy-Item -Path $SourceFilePath -Destination $DestinationFilePath
}
Copy-New-Item schema_mml3_mathml3_rnc schema\mml3\mathml3.rnc
# More of the same...
Copy-New-Item schema_svg11_svg_animation_rnc schema\svg11\svg-animation.rnc
# More of the same...
Copy-New-Item schema_html5_assertions_sch schema\html5\assertions.sch
# More of the same...
(Note that, in this case, the source file names have no file extension.)
If the destination file path does not exist, the function creates an empty file in that path, forcing the creation of any nonexistent directories in the file path. (If Copy-Item can do all that by itself, I could not see how to do it from the documentation.)
This is coming late, but as I stumbled upon this question looking for a solution to a similar problem, the cleanest one I found elsewhere is using robocopy instead of Copy-Item. I needed to copy the whole folder structure together with the files; that's easily achieved via
robocopy "sourcefolder" "destinationfolder" "file.txt" /s
Details about robocopy: https://learn.microsoft.com/en-us/windows-server/administration/windows-commands/robocopy
None of the current answers worked for me to fix the Could not find a part of the path error raised by Copy-Item. After some research and testing, I discovered this error can be raised if the Destination path goes over the 260 character Windows path length limit.
What I mean by that is: if you supply a path to the Destination argument of Copy-Item and any of the files you are copying would exceed the 260 character limit when copied to the Destination folder, Copy-Item will raise the Could not find a part of the path error.
The fix is to shorten your Destination path, or to shorten/flatten the folder structure in the source directory that you are trying to copy.
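If you want to check up front which files would be affected, here is a minimal sketch (the paths are placeholders) that lists every destination path that would exceed the limit:
$source = "C:\path\to\source"
$destination = "C:\a\very\long\destination\path"
Get-ChildItem -Path $source -Recurse -File | ForEach-Object {
    # Build the path the file would have after the copy and flag it if it is too long
    $target = Join-Path $destination $_.FullName.Substring($source.Length).TrimStart('\')
    if ($target.Length -ge 260) { $target }
}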
This may be helpful:
$source = 'c:\some\path\to\a\file.txt'
$dest = 'c:\a\more\different\path\to\the\file.txt'
$dest_dir = 'c:\a\more\different\path\to\the\'

# Create the destination directory (does nothing if it already exists)
[System.IO.Directory]::CreateDirectory($dest_dir)

if (-not [System.IO.File]::Exists($dest))
{
    [System.IO.File]::Copy($source, $dest)
}
I have been digging around and found a lot of solutions to this issue, all of them some alteration rather than a straight Copy-Item command. Granted, some of these questions predate PS 3.0, so the answers are not wrong, but using PowerShell 3.0 I was finally able to accomplish this using the -Container switch for Copy-Item.
Copy-Item $from $to -Recurse -Container
This was the test I ran; there were no errors, and the destination folder had the same folder structure.
New-Item -ItemType dir -Name test_copy
New-Item -ItemType dir -Name test_copy\folder1
New-Item -ItemType file -Name test_copy\folder1\test.txt
# NOTE: with no \ at the end of the destination, the file is created in the root of the destination and the folder1 container is not created
#Copy-Item D:\tmp\test_copy\* D:\tmp\test_copy2 -Recurse -Container
# If the destination does not exist, this creates the matching folder structure and file with no errors
Copy-Item D:\tmp\test_copy\* D:\tmp\test_copy2\ -Recurse -Container
I have two directories: c:\Rar and c:\unRared.
Rar contains hundreds of RAR'ed files. There is only one file inside each RAR, and the file inside the archive has a *.TRN extension.
UnRared has the unarchived files (hundreds of files with the *.TRN extension).
I've been trying to create a PowerShell script to extract only the files that have not been extracted already.
Can't you just supply parameters to whatever compression program you have, telling it not to overwrite existing files?
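For example, a minimal sketch (assuming WinRAR's command-line RAR.exe at its default location; the -o- switch tells it not to overwrite existing files, and the paths are placeholders):
& "C:\Program Files (x86)\WinRAR\RAR.exe" e -o- 'C:\Rar\somefile.rar' 'C:\unRared\'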
Knocked this out on my own... well, not really, it was with the help of a developer:
$dir1 = 'C:\temp\aaa'
$dir2 = 'C:\temp\bbb'
$CmdPath = "C:\Program Files (x86)\WinRAR\RAR.exe"

# Get the .rar files as paths relative to $dir1
$Files2Extract = Get-ChildItem -Path $dir1 -Recurse -Name -Filter *.rar

foreach ($file in $Files2Extract) {
    # Expected name of the extracted .trn file
    $justname = Join-Path $dir2 ((Split-Path $file -Leaf).Split(".")[0] + '.trn')
    # Only extract archives whose .trn has not been extracted yet
    if (!(Test-Path -Path $justname)) {
        & $CmdPath e (Join-Path $dir1 $file) $dir2
    }
}