Copying files from the recycle bin - PowerShell

Here (Listing files in recycle bin) I've found a @Smile4ever post showing how to get the original location of the files in the recycle bin:
(New-Object -ComObject Shell.Application).NameSpace(0x0a).Items() |
    select @{n="OriginalLocation";e={$_.ExtendedProperty("{9B174B33-40FF-11D2-A27E-00C04FC30871} 2")}}, Name |
    export-csv -delimiter "\" -path C:\Users\UserName\Desktop\recycleBinFiles.txt -NoTypeInformation
(gc C:\Users\UserName\Desktop\recycleBinFiles.txt | select -Skip 1) |
    % { $_.Replace('"','') } |
    set-content C:\Users\UserName\Desktop\recycleBinFiles.txt
I'd like to copy them somewhere (in case I'm told that some of them were not supposed to be deleted and someone empties the recycle bin).
Here (https://superuser.com/questions/715673/batch-script-move-files-from-windows-recycle-bin) I've found a @gm2 post for copying them:
$shell = New-Object -ComObject Shell.Application
$recycleBin = $shell.Namespace(0xA) #Recycle Bin
$recycleBin.Items() | %{Copy-Item $_.Path ("C:\Temp\{0}" -f $_.Name)}
They work fine, but I need something more.
I don't know anything about PowerShell, but what I'd like to do is:
for each file in the recycle bin, recreate its original location folder inside the backup folder C:\Temp and copy the file there (so I won't have the problem of multiple files with the same name).
And then zip that C:\Temp folder.
Is there a way to do it?
Thanks!

You should be able to do it like this:
# Set a folder path INSIDE the C:\Temp folder to collect the files and folders
$outputPath = 'C:\Temp\RecycleBackup'
# afterwards, a zip file is created in 'C:\Temp' with filename 'RecycleBackup.zip'
$shell = New-Object -ComObject Shell.Application
$recycleBin = $shell.Namespace(0xA)
$recycleBin.Items() | ForEach-Object {
    # see https://learn.microsoft.com/en-us/windows/win32/shell/shellfolderitem-extendedproperty
    $originalPath = $_.ExtendedProperty("{9B174B33-40FF-11D2-A27E-00C04FC30871} 2")
    # get the root disk from that original path
    $originalRoot = [System.IO.Path]::GetPathRoot($originalPath)
    # remove the root from the original path
    $newPath = $originalPath.Substring($originalRoot.Length)
    # change/remove the : and \ characters in the root for output
    if ($originalRoot -like '?:\*') {
        # a local path. X:\ --> X
        $newRoot = $originalRoot.Substring(0,1)
    }
    else {
        # UNC path. \\server\share --> server_share
        $newRoot = $originalRoot.Trim("\") -replace '\\', '_'
#"\"# you can remove this dummy comment to restore syntax highlighting in SO
    }
    $newPath = Join-Path -Path $outputPath -ChildPath "$newRoot\$newPath"
    # if this new path does not exist yet, create it
    if (!(Test-Path -Path $newPath -PathType Container)) {
        New-Item -Path $newPath -ItemType Directory | Out-Null
    }
    # copy the file or folder with its original name to the new output path
    Copy-Item -Path $_.Path -Destination (Join-Path -Path $newPath -ChildPath $_.Name) -Force -Recurse
}
# clean up the Com object when done
[System.Runtime.Interopservices.Marshal]::ReleaseComObject($shell) | Out-Null
[System.GC]::Collect()
[System.GC]::WaitForPendingFinalizers()
$shell = $null
The following code needs PowerShell version 5 or newer
# finally, create a zip file of this RecycleBackup folder and everything in it.
# append a '\*' to the $outputPath variable to enable recursing the folder
$zipPath = Join-Path -Path $outputPath -ChildPath '*'
$zipFile = '{0}.zip' -f $outputPath.TrimEnd("\")
#"\"# you can remove this dummy comment to restore syntax highlighting in SO
# remove the zip file if it already exists
if(Test-Path $zipFile -PathType Leaf) { Remove-item $zipFile -Force }
Compress-Archive -Path $zipPath -CompressionLevel Optimal -DestinationPath $zipFile -Force
To create a zip file in PowerShell versions below 5
If you do not have PowerShell 5 or later, the Compress-Archive cmdlet is not available.
To create a zip file from the C:\Temp\RecycleBackup folder you can do this instead:
$zipFile = '{0}.zip' -f $outputPath.TrimEnd("\")
#"\"# you can remove this dummy comment to restore syntax highlighting in SO
# remove the zip file if it already exists
if(Test-Path $zipFile -PathType Leaf) { Remove-item $zipFile -Force }
Add-Type -AssemblyName 'System.IO.Compression.FileSystem'
[System.IO.Compression.ZipFile]::CreateFromDirectory($outputPath, $zipFile)
Of course, you can also use third-party software like 7-Zip for this. There are plenty of examples on the net of how to use that in PowerShell, for instance here.
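For example, a minimal sketch of calling the 7-Zip command line tool from PowerShell (the install path, archive name and source folder below are just assumptions; adjust them to your situation):
# default 7-Zip install location; change it if 7-Zip lives elsewhere
$sevenZip = 'C:\Program Files\7-Zip\7z.exe'
if (Test-Path $sevenZip) {
    # 'a' adds files to an archive, -tzip forces the zip format
    & $sevenZip a -tzip 'C:\Temp\RecycleBackup.zip' 'C:\Temp\RecycleBackup\*'
}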
As per your last request, to remove the 'RecycleBackup' folder after the zip is created:
Remove-Item -Path $outputPath -Recurse -Force
Hope that helps

Compress-Archive handle files with same name in Powershell

I'm writing an archiving script which collects the desired files into an array and then adds them to an archive one by one.
I ran into a problem when there is DIR1/file.ext and DIR2/file.ext, because DIR2's file overwrites the previous one.
How can I set a unique filename, or how can this be solved on the fly, instead of copying the files to a directory with the structure and then zipping the whole directory?
Here is my code:
# GET FILE LIST
$outgoingfiles = Get-ChildItem -Depth 1 -Filter "*.EXT" | Where-Object { $_.DirectoryName -like "*OUTGOING*" }
# Create the OUTGOING/archive dir if it doesn't exist
if(-not (Test-Path "OUTGOING/archive")) {
    New-Item -Path "OUTGOING/archive" -ItemType Directory
}
# ZIP outgoing files
ForEach ($outgoing in $outgoingfiles) {
    Compress-Archive $outgoing.FullName -Update -DestinationPath $zippath
}
Thank you!
I don't think there is a way to tell Compress-Archive to rename files when a file with the same name is already included in the zip.
What you can do is create a temporary folder, copy all the files there and rename them if needed. Then create the zip file using the uniquely named files in that folder.
Finally, remove the temp folder again:
$zippath = 'D:\Test\OutGoing.zip' # path and filename for the output zip file
$rootPath = 'D:\Test' # where the files can be found
# create a temporary folder to uniquely copy the files to
$tempFolder = Join-Path -Path ([System.IO.Path]::GetTempPath()) -ChildPath ([Guid]::NewGuid().Guid)
$null = New-Item -ItemType Directory -Path $tempFolder
# create a hashtable to keep track of the file names already copied
$fileHash = @{}
# get the list of files and copy them to a temporary folder
Get-ChildItem -Path $rootPath -Depth 1 -Filter '*.EXT' -File | Where-Object { $_.DirectoryName -like "*OUTGOING*" } | ForEach-Object {
    $count = 1
    $newName = $_.Name
    # test if the file name is already in the hash and if so, append a counter to the basename
    while ($fileHash.ContainsKey($newName)) {
        $newName = "{0}({1}){2}" -f $_.BaseName, $count++, $_.Extension
    }
    # store this file name in the hash and copy the file
    $fileHash[$newName] = $true
    $newFile = Join-Path -Path $tempFolder -ChildPath $newName
    $_ | Copy-Item -Destination $newFile -Force
}
# append '*.*' to the temporary folder name.
$path = Join-Path -Path $tempFolder -ChildPath '*.*'
# next, get the list of files in this temp folder and start archiving
Compress-Archive -Path $path -DestinationPath $zippath -Update
# when done, remove the tempfolder and files
Remove-Item -Path $tempFolder -Force -Recurse
Hope that helps
I would just copy the files along with their parent directories to a destination folder, then zip it up with Compress-Archive. Then you don't have to worry about making filenames unique.
Demo:
$sourceFolder = "C:\\"
$destinationFolder = "C:\\OUTGOING"
# Create destination folder if it doesn't exist
if (-not(Test-Path -Path $destinationFolder -PathType Container))
{
    New-Item -Path $destinationFolder -ItemType Directory
}
# Get all .ext files one level deep
$files = Get-ChildItem -Path $sourceFolder -Depth 1 -Filter *.ext
foreach ($file in $files)
{
    # Get standalone parent directory e.g. DIR1, DIR2
    $parentFolder = Split-Path -Path (Split-Path -Path $file.FullName) -Leaf
    # Create destination path with this parent directory
    $destination = Join-Path -Path $destinationFolder -ChildPath $parentFolder
    # Create destination parent directory if it doesn't exist
    if (-not(Test-Path -Path $destination -PathType Container))
    {
        New-Item -Path $destination -ItemType Directory
    }
    # Copy file to parent directory in destination
    Copy-Item -Path $file.FullName -Destination $destination
}
# Zip up destination folder
# Make sure to pass -Update for redoing compression
Compress-Archive -Path $destinationFolder -DestinationPath "OUTGOING.zip" -Update -CompressionLevel Optimal

PowerShell script for unzipping into a specific named folder

I have several zip files which have generated names like 21321421-12315-sad3fse-23434fg-ggfsd, which don't help to identify the content of the zip.
I need a script which unzips a file and then looks for a PDF file with a partly generated, partly static name, e.g. asdawd-ersrfse-231-Formular2311.
After that it should create a folder with the name of the PDF file and unzip all the zip file's content into this folder.
So far I only have two snippets that run one after the other, but I'm still stuck.
$shell = new-object -com shell.application
$CurrentLocation = get-location
$CurrentPath = $CurrentLocation.path
$Location = $shell.namespace($CurrentPath)
# Find all the Zip files and Count them
$ZipFiles = get-childitem -Path "C:\Install\NB\Teststart" *.zip
$ZipFiles.count | out-default
# Set the Index for unique folders
$Index = 1
# For every zip file in the folder
foreach ($ZipFile in $ZipFiles) {
    # Get the full path to the zip file
    $ZipFile.fullname | out-default
    # Set the location and create a new folder to unzip the files into - Edit the line below to change the location to save files to
    $NewLocation = "C:\Install\NB\Testfinal\$Index"
    New-Item $NewLocation -type Directory
    # Move the zip file to the new folder so that you know which was the original file (can be changed to Copy-Item if needed)
    Copy-Item $ZipFile.fullname $NewLocation
    # List up all of the zip files in the new folder
    $NewZipFile = get-childitem $NewLocation *.zip
    # Get the COMObjects required for the unzip process
    $NewLocation = $shell.namespace($NewLocation)
    $ZipFolder = $shell.namespace($NewZipFile.fullname)
    # Copy the files to the new Folder
    $NewLocation.copyhere($ZipFolder.items())
    # Increase the Index to ensure that unique folders are made
    $Index = $Index + 1
}
Get-ChildItem -Path "C:\Install\NB\Testfinal" -Include "*.pdf" -Recurse | ForEach-Object {
$oldFolder = $_.DirectoryName
# New Folder Name is .pdf Filename, excluding extension
$newFolder = $_.Name.Substring(0, $_.Name.Length - 4)
# Verify Not Already Same Name
Write-Host "Rename: $oldFolder To: $newFolder"
Rename-Item -NewName $newFolder -Path $oldFolder
}
Along the same lines as your own script, first extract the zips and then rename each extracted folder to the same name as the PDF it contains:
$SourceDir = 'C:\Install\NB\Teststart'
$ExtractDir = 'C:\Install\NB\Testfinal'
# Extract each zip to a folder with the same name as the zip file (BaseName)
Get-ChildItem (Join-Path $SourceDir *.zip) | foreach {
    Expand-Archive $_.FullName -DestinationPath (Join-Path $ExtractDir $_.BaseName)
}
# Rename the PDF's parent folder to the same as the PDF
Get-ChildItem (Join-Path $ExtractDir *.pdf) -Recurse | foreach {
    Rename-Item -Path $_.Directory.FullName -NewName $_.BaseName
}
This should do it, and it's much simpler than what you have. It relies on .NET 4.5, which you should already have on your server:
[Reflection.Assembly]::LoadWithPartialName('System.IO.Compression.FileSystem')
# Get all the zip files in the root of the script, change $PSScriptRoot accordingly
Get-ChildItem -Path $PSScriptRoot -Filter *.zip -Recurse | ForEach-Object {
    # Open the archive for reading
    $zip = [IO.Compression.ZipFile]::OpenRead($_.FullName)
    # Find the name of the PDF file from the archive entries
    $archiveName = $zip.Entries | `
        Where-Object { [System.IO.Path]::GetExtension($_.FullName) -eq '.pdf' } | `
        Select-Object @{N = "BaseName"; E = {[System.IO.Path]::GetFileNameWithoutExtension($_.FullName)}} |
        Select-Object -Expand BaseName
    # Close the zip file
    $zip.Dispose()
    # Use the native Expand-Archive to unzip the archive
    # Amend $PSScriptRoot to the destination base path if needed
    Expand-Archive -Path $_.FullName -DestinationPath (Join-Path $PSScriptRoot $archiveName)
}

Compress-Archive and preserve relative paths

I'm having a challenging time getting compress-archive to do what I want...
I have a root project folder and I want to zip up some of the files in sub-directories and preserve the relative paths. For example:
/
├── scripts
│   ├── module1
│   │   └── filex.js
│   └── module2
│       ├── file1.js
│       └── file2.txt
So from my root directory, I'd like to create a zip file that includes module2/*, and I want to keep the folder structure. I'd like my zip to contain:
scripts/module2/file1.js
scripts/module2/file2.txt
But when I run this from the root folder:
Compress-Archive -Path "scripts\module2\*" -DestinationPath tmp.zip
The contents of the zip file only contain:
/file1.js
/file2.txt
It appears that Compress-Archive (as of Windows PowerShell v5.1) doesn't support what you want:
Targeting a folder recursively adds that folder's subtree to the archive, but only by the target folder's name (which becomes a child folder inside the archive), not its path.
Specifically,
Compress-Archive -Path scripts\module2 -DestinationPath tmp.zip
will (recursively) store the contents of scripts\module2 in tmp.zip, but not with archive-internal path .\scripts\module2, just with .\module2 - the target folder's name (the last input path component).
The implication is that you'd have to pass folder scripts instead to get the desired archive-internal path, but that would invariably include the entire subtree of scripts, given that Compress-Archive offers no inclusion/exclusion mechanism.
One - cumbersome - option is to recreate the desired hierarchy in, say, the $env:TEMP folder, copy the target folder there, run Compress-Archive against the root of the recreated hierarchy, and then clean up:
New-Item -Force -ItemType Directory $env:TEMP/scripts
Copy-Item -Recurse -Force scripts/module2 $env:TEMP/scripts
Compress-Archive -LiteralPath $env:TEMP/scripts -DestinationPath tmp.zip
Remove-Item $env:TEMP/Scripts -Recurse -Whatif
Otherwise, you may be able to find a solution:
- by using the .NET v4.5+ [System.IO.Compression.ZipFile] class directly; you can load it into your session with Add-Type -Assembly System.IO.Compression.FileSystem (not necessary in PowerShell Core) - a short sketch follows after this list;
- by using external programs such as 7-Zip.
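For instance, a minimal sketch of the first approach (untested; the folder names come from the question, and building the entry name relative to the current directory is an assumption), storing each file under its archive-internal path scripts/module2/...:
Add-Type -Assembly System.IO.Compression.FileSystem
$zipPath = Join-Path $PWD 'tmp.zip'
# remove an existing zip first, since ZipArchiveMode Create expects a new file
if (Test-Path $zipPath) { Remove-Item $zipPath }
$zip = [System.IO.Compression.ZipFile]::Open($zipPath, [System.IO.Compression.ZipArchiveMode]::Create)
try {
    Get-ChildItem scripts\module2 -Recurse -File | ForEach-Object {
        # entry name = path relative to the current folder, e.g. scripts/module2/file1.js
        $entryName = $_.FullName.Substring($PWD.Path.Length + 1) -replace '\\', '/'
        [System.IO.Compression.ZipFileExtensions]::CreateEntryFromFile($zip, $_.FullName, $entryName) | Out-Null
    }
}
finally {
    # always release the file handle
    $zip.Dispose()
}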
I wanted to do this without having to copy the full structure to a temp directory.
# load the .NET zip support if not already available (needed in Windows PowerShell)
Add-Type -AssemblyName System.IO.Compression.FileSystem
# build list of files to compress
$files = @(Get-ChildItem -Path .\procedimentos -Recurse | Where-Object -Property Name -EQ procedimentos.xlsx);
$files += @(Get-ChildItem -Path .\procedimentos -Recurse | Where-Object -Property Name -CLike procedimento_*_fs_*_d_*.xml);
$files += @(Get-ChildItem -Path .\procedimentos -Recurse | Where-Object -Property FullName -CLike *\documentos_*_fs_*_d_*);
# exclude directory entries and generate fullpath list
$filesFullPath = $files | Where-Object -Property Attributes -CContains Archive | ForEach-Object -Process {Write-Output -InputObject $_.FullName}
#create zip file
$zipFileName = 'procedimentos.zip'
$zip = [System.IO.Compression.ZipFile]::Open((Join-Path -Path $(Resolve-Path -Path ".") -ChildPath $zipFileName), [System.IO.Compression.ZipArchiveMode]::Create)
#write entries with relative paths as names
foreach ($fname in $filesFullPath) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    echo $rname
    $zentry = $zip.CreateEntry($rname)
    $zentryWriter = New-Object -TypeName System.IO.BinaryWriter $zentry.Open()
    $zentryWriter.Write([System.IO.File]::ReadAllBytes($fname))
    $zentryWriter.Flush()
    $zentryWriter.Close()
}
# clean up
Get-Variable -exclude Runspace | Where-Object {$_.Value -is [System.IDisposable]} | Foreach-Object {$_.Value.Dispose(); Remove-Variable $_.Name};
It is a bit of an old thread, but I think this will help folks create zip files with PowerShell 5.1, which is standard with Windows 10 installations these days. The script allows you to keep the original subdirectory structure as well as exclude some unnecessary subtrees / files. This is what I use to archive the source code of my Visual Studio solutions:
Write-Output "Zipping Visual Studio solution..."
# top level from where to start and location of the zip file
$path = "C:\TheSolution"
# top path that we want to keep in the source code zip file
$subdir = "source\TheSolution"
# location of the zip file
$ZipFile = "${path}\TheSolution.zip"
# change current directory
Set-Location "$path"
# collecting list of files that we want to archive excluding those that we don't want to preserve
$Files = @(Get-ChildItem "${subdir}" -Recurse -File | Where-Object {$_.PSParentPath -inotmatch "x64|packages|.vs|Win32"})
$Files += @(Get-ChildItem "${subdir}\packages" -Recurse -File)
$Files += @(Get-ChildItem "${subdir}\.git" -Recurse -File)
$FullFilenames = $files | ForEach-Object -Process {Write-Output -InputObject $_.FullName}
# remove old zip file
if (Test-Path $ZipFile) { Remove-Item $ZipFile -ErrorAction Stop }
#create zip file
Add-Type -AssemblyName System.IO.Compression
Add-Type -AssemblyName System.IO.Compression.FileSystem
$zip = [System.IO.Compression.ZipFile]::Open(($ZipFile), [System.IO.Compression.ZipArchiveMode]::Create)
# write entries with relative paths as names
foreach ($fname in $FullFilenames) {
    $rname = $(Resolve-Path -Path $fname -Relative) -replace '\.\\',''
    Write-Output $rname
    $zentry = $zip.CreateEntry($rname)
    $zentryWriter = New-Object -TypeName System.IO.BinaryWriter $zentry.Open()
    $zentryWriter.Write([System.IO.File]::ReadAllBytes($fname))
    $zentryWriter.Flush()
    $zentryWriter.Close()
}
# release zip file
$zip.Dispose()
The cumbersome technique mklement0 mentioned worked for me. Below is the script I created to support a list of various files mixed with folders.
# Compress LFS based files into a zip
# To use
# 1. place this script in the root folder
# 2. modify the contents of $lfsAssetFiles to point to files relative to this root folder
# 3. modify $zipDestination to be where you want the resultant zip to be placed
# based off of https://stackoverflow.com/a/51394271
# this should match files being .gitignored
$lfsAssetFiles =
"\Assets\Project\Plugins\x32",
"\Assets\Project\Plugins\x64\HugePlugin.dll"
# This is where the contents of the zip file will be structured, because placing them inside of a specific folder of the zip is difficult otherwise
$zipStruct = $PSScriptRoot + "\zipStruct"
# the actual zip file that will be created
$zipDestination = "C:\Dropbox\GitLfsZip\ProjectNameLfs.zip"
# remove files from previous runs of this script
If(Test-path $zipStruct) {Remove-item $zipStruct -Recurse}
If(Test-path $zipDestination) {Remove-item $zipDestination}
Foreach ($entry in $lfsAssetFiles)
{
    # form absolute path to source each file to be included in the zip
    $sourcePath = $PSScriptRoot + $entry;
    # get the parent directories of the path. If the entry itself is a directory, we still only need the parent as the directory will be created when it is copied over.
    $entryPath = Split-Path -Parent $entry
    # form what the path will look like in the destination
    $entryPath = $zipStruct + $entryPath
    # ensure the folders to the entry path exist
    $createdPath = New-Item -Force -ItemType Directory $entryPath
    # copy the file or directory
    Copy-Item -Recurse -Force $sourcePath $createdPath
}
# create a zip file https://blogs.technet.microsoft.com/heyscriptingguy/2015/page/59/
Add-Type -AssemblyName "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($zipStruct, $zipDestination)
# Compress-Archive doesn't work here because it includes the "zipStruct" folder: Compress-Archive -Path $zipStruct -DestinationPath $zipDestination
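As a side note (untested for this particular layout), appending \* to the source path, like the recycle-bin answer earlier in this document does, usually keeps the staging folder itself out of the archive:
Compress-Archive -Path (Join-Path $zipStruct '*') -DestinationPath $zipDestination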

How to do custom file copy in PowerShell?

The PowerShell script below will copy and replace all the contents from source to destination.
Copy-Item -Path $sourcePath -Destination $installationPath -Recurse -Force
This looks simple, but in my case I need to implement custom file copy logic:
Logging everything about the files being copied.
Skipping a certain set of folders.
Sample script:
[Array]$files=Get-ChildItem ($sourceDirectoryPath) -Recurse | Format-Table Name, Directory, FullName
for($i=0;$i -le $files.Length-1;$i++)
{
. . . . . .
# Build destination path based on the source path.
# Check and create folders if it doesn't exists
# Add if cases to skip certain parts.
Copy-Item -Force $sourcePath -Destination $destinationPath
}
Any ideas on how to achieve this? Any other ideas would also help.
Thanks.
Something like this:
$ErrorActionPreference = "Stop"
Set-StrictMode -Version Latest
$sourceDir = "c:\temp\source"
$targetDir = "c:\temp\target"
$skipFiles = @(
    "skip.me",
    "and.me"
)
Get-ChildItem -Path $sourceDir -Recurse | ForEach-Object {
    # ignore folders
    if ($_.PSIsContainer)
    {
        return
    }
    # skip this file?
    if ($skipFiles -contains $_.Name)
    {
        Write-Verbose "Skipping '$($_.FullName)'"
        return
    }
    # create target folder when needed
    $path = $_.DirectoryName -replace "^$([RegEx]::Escape($sourceDir))",$targetDir
    if (!(Test-Path $path))
    {
        Write-Verbose "Creating target path '$path'..."
        New-Item -Path $path -ItemType Directory
    }
    # copy the file
    Write-Verbose "Copying '$($_.FullName)' to '$path'..."
    Copy-Item $_.FullName $path | Out-Null
}
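Note that Write-Verbose output is hidden by default. A minimal way to see those messages and also keep them in a log file (the log path here is just an example) is to enable the verbose preference and wrap the run in a transcript:
# make the Write-Verbose messages from the snippet above visible
$VerbosePreference = 'Continue'
# record everything shown on the console, including the verbose lines
Start-Transcript -Path 'C:\temp\copy-log.txt'
# ... run the copy loop above ...
Stop-Transcript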

PowerShell Compress-Archive By File Extension

How can I use the PowerShell 5.0 Compress-Archive cmdlet to recursively take any .config files in a directory and zip them up while maintaining the directory structure. Example:
Directory1
    Config1.config
Directory2
    Config2.config
The aim is a single zip file that also contains the above directory structure, but with only the .config files.
I would suggest copying the files to a temporary directory and compressing that. Example:
$path = "test"
$filter = "*.config"
#To support both absolute and relative paths..
$pathitem = Get-Item -Path $path
#If sourcepath exists
if($pathitem) {
    #Get name for tempfolder
    $tempdir = Join-Path $env:temp "CompressArchiveTemp"
    #Create temp-folder
    New-Item -Path $tempdir -ItemType Directory -Force | Out-Null
    #Copy files
    Copy-Item -Path $pathitem.FullName -Destination $tempdir -Filter $filter -Recurse
    #Get the items inside the root folder to avoid including the root folder "test" itself
    $sources = Get-ChildItem -Path (Join-Path $tempdir $pathitem.Name) | Select-Object -ExpandProperty FullName
    #Create zip from tempfolder
    Compress-Archive -Path $sources -DestinationPath config-files.zip
    #Remove temp-folder
    Remove-Item -Path $tempdir -Force -Recurse
}