Compress files in a folder to a zip file using PS - powershell

I have the following script to compress a folder (all files in the folder) into a zip file:
set-content $zipFileName ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
$ZipFile = (new-object -com shell.application).NameSpace($zipFileName)
Get-ChildItem $folder | foreach {$zipFile.CopyHere($_.fullname)}
where, for example, $folder = "C:\Test" and $zipFileName = "C:\data\test.zip".
It works fine if "C:\Test" contains no empty sub-folders, and it seems to work recursively, compressing all files within sub-folders. I really like this simple script. For example:
C:\Test
    file1.dat
    file2.dat
    Test-Sub
        File21.bat
        ....
However, I get an error in one case. I find that if there is any empty folder such as "C:\Test\EmptySub",
C:\Test
    file1.dat
    file2.dat
    Test-Sub
        File21.bat
        ....
    EmptySub
    AnotherSub
        file31.sp1
        ...
the script generates an error. I tried the following script:
Get-ChildItem $files -Recurse | foreach { if (!$_.PSIsContainer) { $zipFile.CopyHere($_.fullname) } }
This does not work as expected; it just skips all the sub-folders. Is there a filter or clause available to skip all the empty sub-folders?
Updated: Based on the suggestions, I gave it a try, but my problem has not been resolved. Here is the update to my question. First, I updated the script above to show how the $zipFile object is created. Secondly, here is the suggested code:
Get-ChildItem $files | ? { -not ($_.PSIsContainer -eq $True -and $_.GetFiles().Count -eq 0) } |
    % { $zipfile.CopyHere($_.fullname) }
I tried the above updates on my Windows XP machine, and it works fine with empty sub-folders. However, the same code does not work on Windows 2003 Server. The following is the error message:
[Window Title]
Compressed (zipped) Folders Error
[Content]
File not found or no read permission.
[OK]
I am not sure whether this type of PK object works on Windows 2003 Server, or whether there are other settings required for the object.

You can detect empty directories by testing for an empty return from Get-ChildItem. For example, this returns the list of empty directories:
dir | where{$_.PSIsContainer -and -not (gci $_)}
Though in your case, you want the inverse:
Get-ChildItem $files | where { -not ($_.PSIsContainer -and -not (gci $_)) } | foreach { $zipfile.CopyHere($_.fullname) }

Your code works recursively in the sense that you get any kind of child item (folders included). If your intention is to exclude empty folders, you should filter them out.
Try this one-liner:
gci $folder |
    ? { -not ($_.PSIsContainer -eq $True -and $_.GetFiles().Count -eq 0) } |
    % { $zipfile.CopyHere($_.fullname) }

Here is the function I created to zip files. The reason you are getting the read error is that .CopyHere copies asynchronously, so my script verifies that each file exists in the zip file before continuing to the next one:
function Out-ZipFile ([string]$path)
{
    try
    {
        # Collect the pipeline input (the files to zip)
        $files = $input

        # Normalize the target path: expand ".\", make it absolute, and ensure a .zip extension
        if ($path.StartsWith(".\")) { $path = $path.Replace(".\", "$($pwd)\") }
        if (-not $path.Contains("\")) { $path = "$($pwd)\$($path)" }
        if (-not $path.EndsWith('.zip')) { $path += '.zip' }

        # Create an empty zip file (the "PK" header) if it does not exist yet
        if (-not (Test-Path $path)) {
            Set-Content $path ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
        }

        $ZipFile = (New-Object -Com Shell.Application).NameSpace($path)

        $files | % {
            $FileName = $_.Name
            "Adding $FileName to $path"
            $ZipFile.CopyHere($_.FullName)
            # CopyHere is asynchronous: wait until the entry shows up in the zip before adding the next file
            While (($ZipFile.Items() | where { $_.Name -like $FileName }).Size -lt 1) { Start-Sleep -m 100 }
        }
    }
    catch
    {
        Write-Output "Error Encountered: `n$_"
        return $_
    }
}
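A usage sketch, assuming the example paths from the question; only files are piped in, so empty sub-folders never reach CopyHere (note that adding individual files this way stores them flat at the root of the zip):
# Minimal usage sketch with the question's example paths
Get-ChildItem C:\Test -Recurse | Where-Object { -not $_.PSIsContainer } |
    Out-ZipFile "C:\data\test.zip"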

Related

Powershell script to copy old files and create shortcuts

I am working on creating a stubbing script using PowerShell. My intention for this script is to copy any data that hasn't been written to in the past x time period to a new location, check that the file copied, create a shortcut to the "archived" file in its original location, and eventually delete the old file (I haven't written this section yet). Below is what I have so far. The issue I am having now is that a shortcut is created, however all of the shortcuts are saved directly in the C:\Temp directory rather than in the subfolder where the original file was stored. I think my issue is with $link and I need to split the paths and join them, but I am not sure. Any assistance is greatly appreciated!
# Variables
$Original = "C:\Temp"
$Archive = "\\data\users\Temp"

Get-ChildItem -Path $Original -Recurse |
    Where-Object {
        $_.LastWriteTime -lt [datetime]::Now.AddMinutes(-1)
    } | Copy-Item -Destination $Archive -Recurse -Force

$sourceFiles = Get-ChildItem $Original -Recurse | Where-Object { $_.LastWriteTime -lt [datetime]::Now.AddMinutes(-1) } | Select Name, Fullname
$destFiles = Get-ChildItem $Archive -Recurse | Select Name, Fullname
$Comparison = Compare-Object $sourceFiles.name $destFiles.name

If ($comparison.sideindicator -ne "<=") {
    Get-ChildItem $Archive -Recurse | where { $_.PsIsContainer -eq $false } | ForEach-Object {
        $path = '"' + $_.FullName + '"'
        $link = $Original + '\' + $_.Basename + '.lnk'
        $wshell = New-Object -ComObject WScript.Shell
        $shortcut = $wshell.CreateShortcut($link)
        $shortcut.TargetPath = $path
        $shortcut.Save()
    }
}
If ($comparison.sideindicator -eq "<=") {
    $comparison.inputobject, $sourceFiles.fullname | Out-File 'C:\ScriptLogs\stubbing.csv' -Force
}
This is the problematic code:
$link = $Original + '\' + $_.Basename + '.lnk'
Basename is only the filename portion without the path, so the .lnk files end up directly in the top-level directory.
You have to use the relative path like this:
$RelativePath = [IO.Path]::GetRelativePath( $Archive, $_.FullName )
$link = [IO.Path]::ChangeExtension( "$Original\$RelativePath", 'lnk' )
GetRelativePath requires .NET Core / .NET 5 or later (i.e. PowerShell 7); it is not available in the .NET Framework used by Windows PowerShell 5.1. To support older versions you can use:
Push-Location $Archive
try { $RelativePath = Resolve-Path -Relative $_.FullName }
finally { Pop-Location }
Resolve-Path -Relative uses the current location as the base path, so we use Push-Location to temporarily change the current location. The try / finally is used to ensure the original location is restored even if Resolve-Path throws an exception (e.g. when you have set $ErrorActionPreference = 'Stop').
Although the code might work like this, I think it could be refactored as follows:
A single Get-ChildItem call that stores the files to be moved in an array (your current $sourceFiles = ... line).
A foreach loop over this array that moves each file and creates the shortcut.
I think the Compare-Object isn't even necessary, but maybe I'm missing something.
Currently you are iterating over the same directories multiple times, which isn't very efficient and results in a lot of duplicate code. A rough sketch of that refactoring is shown below.
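This sketch assumes the $Original and $Archive paths from the question; the shortcut is created next to the original file and points at the archived copy (deleting the original is left out, as in the question):
$Original = "C:\Temp"
$Archive  = "\\data\users\Temp"
$cutoff   = [datetime]::Now.AddMinutes(-1)

# One enumeration of the source tree; -File skips directories (PowerShell 3.0+)
$sourceFiles = Get-ChildItem $Original -Recurse -File | Where-Object { $_.LastWriteTime -lt $cutoff }

$wshell = New-Object -ComObject WScript.Shell
foreach ($file in $sourceFiles) {
    # Rebuild the file's sub-path under the archive root
    $relative    = $file.FullName.Substring($Original.Length).TrimStart('\')
    $destination = Join-Path $Archive $relative

    # Make sure the destination folder exists, then copy
    $destDir = Split-Path $destination
    if (-not (Test-Path $destDir)) { New-Item -Path $destDir -ItemType Directory | Out-Null }
    Copy-Item $file.FullName -Destination $destination -Force

    # Create the shortcut in the file's original folder, pointing at the archived copy
    $link = [IO.Path]::ChangeExtension($file.FullName, 'lnk')
    $shortcut = $wshell.CreateShortcut($link)
    $shortcut.TargetPath = $destination
    $shortcut.Save()
}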

PowerShell script to filter files from the last 24 to 25 hrs, zip them, and get them ready for backup upload

$source = "C:\folder\*.*"
$destination = "C:\folder3\test.zip"
$results = (Get-ChildItem -Path $source -Filter *.* | ? {
$_.LastWriteTime -gt (Get-Date).AddDays(-1))
}
Add-Type -assembly "system.io.compression.filesystem"
[io.compression.zipfile]::CreateFromDirectory($results, $destination)
Now I need to select the last modified files and zip them, either with compress or 7zip. Any help?
To zip files via Powershell, I usually install 7-zip on the system and then include this function in the script:
#7-ZIP FILE FUNCTION (must install 7-zip on system)
#-------------------------------------------------------------------------------------
#
function ZipFile7 ([string] $pZipFile, [string] $pFiles) {
    $7zipExe = "$($env:programfiles)\7-Zip\7z.exe"
    $7zArgs = @("a", "-tzip", $pZipFile, $pFiles, "-r")
    & $7zipExe $7zArgs
}
#-------------------------------------------------------------------------------------
$pZipFile is the full path of the file you wish to create, and $pFiles is the path to the files you wish to zip (can include wildcards like C:\*.txt).
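For example, a sketch assuming 7-Zip is installed in the default location and reusing the question's paths; the function itself does no date filtering:
# Usage with a single path/wildcard string
ZipFile7 -pZipFile "C:\folder3\test.zip" -pFiles "C:\folder\*.*"

# To zip only files modified in the last day, one option (sketch) is to pass an array of
# full paths straight to 7z instead of a single wildcard:
$recent = Get-ChildItem "C:\folder" | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }
& "$($env:ProgramFiles)\7-Zip\7z.exe" a -tzip "C:\folder3\test.zip" $recent.FullName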
$source = "C:\folder\*.*"
$destination = "C:\folder3\test.zip"
$results = Get-ChildItem -Path $source | Where {$_.LastWriteTime -gt (Get-Date).AddDays(-1)} | Select -ExpandProperty "FullName"
set-content $destination ("PK" + [char]5 + [char]6 + ("$([char]0)" * 18))
$Shell = New-Object -Com Shell.Application
$ZipFile = $Shell.Namespace($destination)
$Results | ForEach {
$Count = $ZipFile.Items().Count
$ZipFile.CopyHere($_)
While ($ZipFile.Items().Count -eq $Count) {Start-Sleep -Milliseconds 200} # Wait till the file is zipped
}
Note: you might want to improve the while/wait statement (e.g. set a timeout) as it might hang if anything goes wrong with copying the file to the zip folder.
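If PowerShell 5.0 or later is available, the built-in Compress-Archive cmdlet avoids the manual wait loop entirely; a minimal sketch with the same variables:
# Sketch assuming PowerShell 5.0+; -Path takes the filtered file paths directly
$source = "C:\folder\*.*"
$destination = "C:\folder3\test.zip"
$results = Get-ChildItem -Path $source | Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) }
Compress-Archive -Path $results.FullName -DestinationPath $destination -Force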

Powershell copy file after a date has passed with file structure

I am trying to copy files off one server and onto another, and I want to keep the folder structure, like so: C:\folder\folder\file. If the folder is there, copy the file into it; if it is not, create the folders and then copy into it.
I would also like it to filter out the files that are still needed, so I want to keep files for 30 days and only then move them.
[int]$Count = 0
$filter = (Get-Date).AddDays(-15).ToString("MM/dd/yyyy")
Get-WMIObject Win32_LogicalDisk | ForEach-Object {
    $SearchFolders = Get-Childitem ($_.DeviceID + "\crams") -Recurse
    $FileList = $SearchFolders |
        Where-Object { $_.name -like "Stdout_*" -and $_.lastwritetime -le $filter }
    [int]$Totalfiles = ($FileList | Measure-Object).Count
    Write-Host "There are a total of $Totalfiles found."
    echo $FileList
    Start-Sleep 30
    ForEach ($Item in $FileList)
    {
        $Count++
        $File = $Item
        Write-Host "Now Moving $File"
        $destination = "C:\StdLogFiles\"
        $path = Test-Path (Get-ChildItem $destination -Exclude "Stdout_*")
        if ($path -eq $true) {
            Write-Host "Directory Already exists"
            Copy-Item $File -Destination $destination
        }
        elseif ($path -eq $false) {
            cd $destination
            mkdir $File
            Copy-Item $File -Destination $destination
        }
    }
}
That is what I have so far. It has changed a lot while trying to get it to work, but the search works and so does the date part; I just cannot get it to keep the folder structure.
Okay, I took out the bottom part and put in:
ForEach ($Item in Get-ChildItem $FileList) {
    $Count++
    $destination = "C:\StdLogFiles"
    $File = $Item
    Write-Host "Now Moving $File to $destination"
    Copy-Item -Path $file.fullname -Destination $destination -Force
}}
I also tried Get-Content, but the path is null.
It is copying everything that is in C:\ into that folder, but not the files; I do not understand what it is doing now. I had it copying the files, even went back to an older version, and can't get it to work again. I am going to leave it before I break it more.
Any help or thoughts would be appreciated
I think RoboCopy is probably a simpler solution for you, to be honest. But if you insist on using PowerShell, you are going to need to set up your destination better if you want to keep your file structure. You also want to leave your filter date as a [DateTime] object instead of converting it to a string, since what you are comparing it to (lastwritetime) is a [DateTime] object. You'll need to do something like:
$filter = (Get-Date).AddDays(-15)
$FileList = Get-WMIObject Win32_LogicalDisk | ForEach-Object {
    Get-Childitem ($_.DeviceID + "\crams") -Recurse | Where-Object { $_.name -like "Stdout_*" -and $_.lastwritetime -le $filter }
}
$Totalfiles = $FileList.count
For ($i = 1; $i -le $TotalFiles; $i++)
{
    $File = $FileList[($i - 1)]
    Write-Progress -Activity "Backing up old files" -CurrentOperation ("Copying file: " + $file.Name) -Status "$i of $Totalfiles files" -PercentComplete ($i * 100 / $Totalfiles)
    $Destination = (Split-Path $file.fullname) -replace "^.*?\\crams", "C:\StdLogFiles"
    If (!(Test-Path $Destination)) {
        New-Item -Path $Destination -ItemType Directory | Out-Null
    }
    Copy-Item $File -Destination $Destination
}
Write-Progress -Activity "Backing up old files" -Completed
That gathers all the files you need to move from all disks. It takes a count of them, and then enters a loop that will cycle as many times as you have files. In the loop it assigns the current item to a variable, then updates a progress bar based on progress. It then parses the destination by replacing the beginning of the file's full path (minus file name) with your target destination of 'C:\StdLogFiles'. So D:\Crams\HolyPregnantNunsBatman\Stdout04122015.log becomes C:\StdLogFiles\HolyPregnantNunsBatman. Then it tests the path, and if it's not valid it creates it (piped to Out-Null to avoid spam). Then we copy the file to the destination and move on to the next item. After the files are done we close out the progress bar.
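To see just that path-mapping step in isolation (using the hypothetical example path from the explanation):
# Hypothetical example path; Split-Path drops the file name, then -replace (case-insensitive
# by default) swaps everything up to "\crams" for the target root
$example = 'D:\Crams\HolyPregnantNunsBatman\Stdout04122015.log'
(Split-Path $example) -replace '^.*?\\crams', 'C:\StdLogFiles'
# -> C:\StdLogFiles\HolyPregnantNunsBatman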

Powershell test if folder empty

In Powershell, how do I test if a directory is empty?
If you are not interested in hidden or system files, you can also use Test-Path.
To see if a file exists in the directory .\temp you can use:
Test-Path -Path .\temp\*
or, more briefly:
Test-Path .\temp\*
Try this...
$directoryInfo = Get-ChildItem C:\temp | Measure-Object
$directoryInfo.count #Returns the count of all of the objects in the directory
If $directoryInfo.count -eq 0, then your directory is empty.
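For example, a minimal check using the same C:\temp path:
$directoryInfo = Get-ChildItem C:\temp | Measure-Object
if ($directoryInfo.Count -eq 0) {
    "C:\temp is empty"
}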
To prevent enumerating every file under C:\Temp (which can be time consuming), we can do something like this:
if((Get-ChildItem c:\temp\ -force | Select-Object -First 1 | Measure-Object).Count -eq 0)
{
# folder is empty
}
filter Test-DirectoryEmpty {
    # True when the piped-in directory contains nothing (including hidden items)
    -not (Get-ChildItem $_\* -Force)
}
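Usage sketch, piping a directory path into the filter:
# Returns $true for an empty directory, $false otherwise
'C:\temp' | Test-DirectoryEmpty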
One line:
if( (Get-ChildItem C:\temp | Measure-Object).Count -eq 0)
{
#Folder Empty
}
It's a waste to get all files and directories and count them only to determine whether a directory is empty. It is much better to use the .NET EnumerateFileSystemInfos method:
$directory = Get-Item -Path "c:\temp"
if (!($directory.EnumerateFileSystemInfos() | select -First 1))
{
"empty"
}
You can use the .GetFileSystemInfos().Count method of a DirectoryInfo object to check how many items (files and subdirectories) it contains. See the Microsoft Docs.
$docs = Get-Item -Path .\Documents\Test
$docs.GetFileSystemInfos().Count
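For example, building on the snippet above (same test path):
if ($docs.GetFileSystemInfos().Count -eq 0) {
    "Folder is empty"
}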
Simple approach
if (-Not (Test-Path .\temp\*))
{
    # do your stuff here
}
you can remove -Not if you want to enter the 'if' when files are present.
# Define folder path to assess and delete
$Folder = "C:\Temp\Stuff"

# Delete all empty subfolders in a parent folder
Get-ChildItem -Path $Folder -Recurse -Force | Where-Object { $_.PSIsContainer -and (Get-ChildItem -Path $_.FullName -Recurse -Force | Where-Object { !$_.PSIsContainer }) -eq $null } | Remove-Item -Force -Recurse

# Delete the parent folder itself if it is empty
If ((Get-ChildItem -Path $Folder -Force | Select-Object -First 1 | Measure-Object).Count -eq 0) { Remove-Item -Path $Folder -Force -Recurse }
Just adding to JPBlanc's answer: if the directory path is $DirPath, this code also works for paths that include square bracket characters.
# Escape square brackets with backticks so they are not treated as wildcard characters
$DirPathDirty = $DirPath.Replace('[', '`[')
$DirPathDirty = $DirPathDirty.Replace(']', '`]')
if (Test-Path -Path "$DirPathDirty\*") {
    # Code for directory not empty
}
else {
    # Code for empty directory
}
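An alternative sketch that lets PowerShell do the escaping; [WildcardPattern]::Escape backtick-escapes all wildcard metacharacters ([, ], *, ?), not just brackets:
$escaped = [System.Management.Automation.WildcardPattern]::Escape($DirPath)
if (Test-Path -Path "$escaped\*") {
    # Code for directory not empty
}
else {
    # Code for empty directory
}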
#################################################
# Script to verify if any files exist in the Monitor Folder
# Author Vikas Sukhija
# Co-Authored Greg Rojas
# Date 6/23/16
#################################################

################ Define Variables ############
$email1 = "yourdistrolist@conoso.com"
$fromadd = "yourMonitoringEmail@conoso.com"
$smtpserver = "mailrelay.conoso.com"
$date1 = Get-Date -Hour 1 -Minute 1 -Second 1
$date2 = Get-Date -Hour 2 -Minute 2 -Second 2

############### Folder that needs monitoring ############################
$directory = "C:\Monitor Folder"
$directoryInfo = Get-ChildItem $directory | Measure-Object
$directoryInfo.count

if ($directoryInfo.Count -gt 0)
{
    # SMTP relay address
    $msg = New-Object Net.Mail.MailMessage
    $smtp = New-Object Net.Mail.SmtpClient($smtpServer)

    # Mail sender
    $msg.From = $fromadd
    # Mail recipient
    $msg.To.Add($email1)

    $msg.Subject = "WARNING : There are " + $directoryInfo.count + " file(s) on " + $env:computername + " in " + $directory
    $msg.Body = "On " + $env:computername + " files have been discovered in the " + $directory + " folder."
    $smtp.Send($msg)
}
Else
{
    Write-Host "No files here" -ForegroundColor Green
}
Example of removing empty folder:
IF ((Get-ChildItem "$env:SystemDrive\test" | Measure-Object).Count -eq 0) {
remove-Item "$env:SystemDrive\test" -force
}
$contents = Get-ChildItem -Path "C:\New folder"
if ($contents.Count -eq 0) # If the folder is empty, Get-ChildItem returns nothing
{
    Remove-Item "C:\New folder"
    echo "Empty folder. Deleted folder"
}
else {
    echo "Folder not empty"
}
One line for piping, also using GetFileSystemInfos().Count as the test:
gci -Directory | where { !($_.GetFileSystemInfos().Count) }
will show all directories which have no items. Result:
    Directory: F:\Backup\Moving\Test

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
d-----         5/21/2021   2:53 PM                Test [Remove]
d-----         5/21/2021   2:53 PM                Test - 1
d-----         5/21/2021   2:39 PM                MyDir [abc]
d-----         5/21/2021   2:35 PM                Empty
I post this because I was having edge-case issues with names that contained brackets [ ]: when using other methods and piping the output to Remove-Item, the directory names with brackets were missed.
Getting a count from Get-ChildItem can provide false results because an empty folder or error accessing a folder could result in a 0 count.
The way I check for empty folders is to separate out errors:
Try { # Test if folder can be scanned
    $TestPath = Get-ChildItem $Path -ErrorAction SilentlyContinue -ErrorVariable MsgErrTest -Force | Select-Object -First 1
}
Catch {}
If ($MsgErrTest) { "Error accessing folder" }
Else { # Folder can be accessed or is empty
    "Folder can be accessed"
    If ([string]::IsNullOrEmpty($TestPath)) { # Folder is empty
        "  Folder is empty"
    }
}
The above code first tries to access the folder. If an error occurs, it reports that an error occurred. If there was no error, it states that the folder can be accessed and then checks whether it is empty.
After looking into some of the existing answers, and experimenting a little, I ended up using this approach:
function Test-Dir-Valid-Empty {
    param([string]$dir)
    (Test-Path ($dir)) -AND ((Get-ChildItem -att d,h,a $dir).count -eq 0)
}
This will first check for a valid directory (Test-Path ($dir)). It will then check for any contents, including directories, hidden files, or "regular" files**, due to the attributes d, h, and a, respectively.
Usage should be clear enough:
PS C:\> Test-Dir-Valid-Empty projects\some-folder
False
...or alternatively:
PS C:\> if(Test-Dir-Valid-Empty projects\some-folder){ "empty!" } else { "Not Empty." }
Not Empty.
** Actually I'm not 100% certain what the defined effect of a is here, but it does in any case cause all files to be included. The documentation states that ah shows hidden files, and I believe as should show system files, so I'm guessing a on its own just shows "regular" files. If you remove it from the function above, it will in any case find hidden files, but not others.

"$_.extension -eq" not working as intended?

I tried to write a small script to automate the creation of playlists (m3u) for dozens of folders/subfolders of mp3/mp4 files, while omitting the various other misc files therein. I know very little about PowerShell but managed to piece together something that almost works. The only blip is that when I use "$_.extension -eq", it doesn't seem to work, or at least I'm not using it right. If I use it to match log/txt files in a temp folder, for example, it works, but not in this instance. Here is the code:
$pathname = read-host "Enter path"
$root = Get-ChildItem $pathname | ? { $_.PSIsContainer }
$rootpath = $pathname.substring(0, 2)
Set-Location $rootpath
Set-Location $pathname
foreach ($folder in $root) {
    Set-Location $folder
    foreach ($file in $folder) {
        $txtfile = ".m3u"
        $files = gci | Where-Object { $_.extension -eq ".mp3" -or ".mp4" }
        $count = $files.count
        if ($count -ge 2) {
            $txtfile = "_" + $folder.name + $txtfile
            Add-Content $txtFile $files
        }
    }
    if (test-path $txtFile) {
        Add-Content $txtFile `r
    }
    Set-Location $pathname
}
I have tried several variations, like swapping "-match" for "-eq", but no luck. Incidentally, if I omit the "-or ".mp4"" from the parentheses then it works fine, but I need it to match both mp3 and mp4, and only those.
Thanks in advance.
Since the complaint is about the extension check, let's start with it. There is a bug in the code; this expression is technically valid syntax:
$_.extension -eq ".mp3" -or ".mp4"
But apparently the intent was:
$_.extension -eq ".mp3" -or $_.extension -eq ".mp4"
The original always evaluates to $true, because the standalone string ".mp4" is treated as a truthy value rather than compared against anything, so every file passes the filter. Try the corrected expression first.
I'm going to add this as a shortcut option:
gci | Where-Object {".mp3",".mp4" -eq $_.extension}
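On PowerShell 3.0 and later, the -in operator expresses the same test a little more readably:
# Keep only .mp3 and .mp4 files; -in tests membership in the list on the right
gci | Where-Object { $_.Extension -in ".mp3", ".mp4" }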