Non-Terminating Exception in Folder Deletion Script - powershell

I've written a PowerShell script that periodically deletes folders on my machine.
The algorithm is as follows:
Drill down into each directory structure to the lowest subfolders
Check the creation date of the subfolder
If it's 14 days old, or older, delete it
LOG EVERYTHING (not part of the algorithm, just good practice)
When running, it operates exactly as expected...
... Except it throws the following, non-terminating exception:
Get-ChildItem : Could not find a part of the path 'C:\foo\baz'.
At C:\src\CoreDev\Trunk\Tools\BuildClean script\buildclean.ps1:55 char:15
+ Get-ChildItem <<<< -recurse -force |
+ CategoryInfo          : ReadError: (C:\foo\baz:String) [Get-ChildItem], DirectoryNotFoundException
+ FullyQualifiedErrorId : DirIOError,Microsoft.PowerShell.Commands.GetChildItemCommand
Why is this happening? More importantly, how can I remove it, and will it cause an issue?
The script is as follows:
# folderclean.ps1
# This script will remove each leaf node of a directory, provided that leaf is over
# 14 days old.
# CONSTANT DECLARATIONS
# testing (run on my local machine)
$proj_loc = "C:\foo", "C:\bar"
$logpath = "C:\Logs\BuildClean\$(Get-Date -format yyyyMMdd).log"
function Write-ToLogFile {
    param ([string]$stringToWrite)
    Add-Content $logpath -value $stringToWrite
}
# Function to check if a folder is a leaf folder.
# First, retrieve the directory $item is pointing to
# Then, create a list of children of $item that are folders
# If this list is either empty or null, return $true
# Otherwise, return $false
function Folder-IsLeaf($item) {
    $ary = Get-ChildItem $item -force | ?{ $_.PSIsContainer }
    if (($ary.length) -eq 0 -or $ary -eq $null) {
        return $true
    }
    return $false
}
# Deletes leaf folders that are older than a certain threshold.
# Get a list of children of the folder, where each child is a folder itself and
# was created over 14 days ago and the folder is a leaf
# For each of these children, delete them and increment $folderCount
# Get a list of children of the folder, where each child is a folder itself and
# was last modified over 14 days ago and the folder is a leaf
# For each of these children, delete them and increment $folderCount
function Remove-LeafFolders($path) {
    $createdCount = 0
    $modifiedCount = 0
    Write-ToLogFile "Operation started at $(Get-Date -format "dd/MM/yyyy hh:mm:ss.fff")"
    Write-ToLogFile "Looking in $proj_loc"
    Write-ToLogFile ""
    $start = $(Get-Date)
    $proj_loc |
        Get-ChildItem -recurse -force |
        ?{
            $_.PSIsContainer -and ($_.CreationTime).AddDays(15) -lt $(Get-Date) -and $(Folder-IsLeaf $_.FullName) -eq $true
        } | %{
            $formattedDate = $($_.CreationTime).ToString("dd/MM/yyyy hh:mm:ss");
            Write-ToLogFile "Folder $($_.FullName) is being removed; created: $formattedDate"
            Remove-Item $_.FullName -recurse;
            $createdCount += 1
        }
    $end = $(Get-Date)
    $elapsed = $end - $start
    Write-ToLogFile "Operation completed at $(Get-Date -format "dd/MM/yyyy hh:mm:ss.fff")."
    Write-ToLogFile "Folders removed: $createdCount"
    Write-ToLogFile "Time elapsed: $(($elapsed).TotalMilliseconds) ms"
    Write-ToLogFile "-------------------------------"
}
Remove-LeafFolders($proj_loc)

I found this other StackOverflow question and, after looking through the answer, I realised that the problem was the pipeline: Remove-Item was deleting folders while Get-ChildItem was still enumerating them, so collecting the results into a variable first avoids the error. I changed my code as follows:
...
$leafList = $proj_loc |
    Get-ChildItem -recurse -force |
    ?{
        $_.PSIsContainer -and ($_.CreationTime).AddDays(15) -lt $(Get-Date) -and $(Folder-IsLeaf $_.FullName) -eq $true
    }
Foreach ($folder in $leafList)
{
    $formattedDate = $($folder.CreationTime).ToString("dd/MM/yyyy hh:mm:ss");
    Write-ToLogFile "Folder $($folder.FullName) is being removed; created: $formattedDate"
    Remove-Item $folder.FullName -recurse;
    $createdCount += 1
}
...
I created a few local folders and screwed around with them. No exceptions cropped up, so this appears to have worked:
Operation started at 10/12/2012 05:16:18.631
Looking in C:\foo C:\bar
Folder C:\foo\baz is being removed; created: 09/01/2010 02:00:00
Folder C:\bar\baz3\recursion is being removed; created: 01/01/2008 01:00:00
Operation completed at 10/12/2012 05:16:18.748.
Folders removed: 2
Time elapsed: 33.0033 ms
-------------------------------
Operation started at 10/12/2012 05:41:59.246
Looking in C:\foo C:\bar
Folder C:\foo\baz2\NewFolder is being removed; created: 10/10/2010 10:10:10
Folder C:\bar\baz3\barbar is being removed; created: 20/11/2012 05:37:38
Operation completed at 10/12/2012 05:41:59.279.
Folders removed: 2
Time elapsed: 21.0021 ms
-------------------------------

I think it would be easier to accomplish this sort of recursive deletion using a bottom-up approach rather than the top-down order that Get-ChildItem gives you by default. Try this instead:
$proj_loc |
Get-ChildItem -recurse -force -Name | sort -desc | Get-Item |
? { ....
}
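For illustration, here is a hedged, untested sketch of that bottom-up idea applied to the original filter. It is a slight variation: instead of going through -Name and Get-Item, it sorts the container objects by FullName descending, which also puts child folders ahead of their parents, and it reuses the Folder-IsLeaf and Write-ToLogFile helpers from the question:
# Sort containers by full path, descending, so children are always visited before their parents.
$cutoff = (Get-Date).AddDays(-14)
$proj_loc |
    Get-ChildItem -recurse -force |
    ? { $_.PSIsContainer } |
    Sort-Object FullName -Descending |
    ? { $_.CreationTime -lt $cutoff -and (Folder-IsLeaf $_.FullName) } |
    % {
        Write-ToLogFile "Folder $($_.FullName) is being removed; created: $($_.CreationTime)"
        Remove-Item $_.FullName -recurse
    }
Because Sort-Object has to collect all of its input before emitting anything, every folder is enumerated before the first Remove-Item runs, which also sidesteps the original non-terminating exception.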

Related

Move multiple files from folder A to folder B

We need to move .csv files from the folder where they are stored down to an external server using PowerShell.
This is what I've tried so far, but for some reason I only get the "Not copying" message and the names of the files:
$DestinationFolder = "C:\d1\"
$SourceFolder = "C:\s1\"
If (-not (Test-Path $DestinationFolder) ) {
New-Item -ItemType Directory -Force -Path $DestinationFolder
}
$EarliestModifiedTime = (Get-Date).AddMinutes(200).Date # get's current time + adds 30 min
$LatestModifiedTime = (Get-Date).Date
echo($DestinationFolder); # will check time
Get-ChildItem "C:\s1\*.*" |
ForEach-Object {
if ( ($_.CreationTime -ge $EarliestModifiedTime) -and ($_.CreationTime -lt $LatestModifiedTime) ){ # i.e., all day yesterday
Copy-Item $_ -Destination $DestinationFolder -Force
Write-Host "Copied $_" }
else {
Write-Host "Not copying $_"
}
}
Does it work if you simplify it and just run a test (e.g. pick one file rather than trying to run for a delta of every 30 mins / last day)? Just thinking that first you need to see whether the problem is with accessing (or formatting) your source/destination directories, or with the delta logic itself. Perhaps some more error conditions would help...
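For what it's worth, here is a hedged, untested sketch of what an "all day yesterday" window could look like, using the same paths from the question:
$DestinationFolder = "C:\d1\"
$SourceFolder = "C:\s1\"
# Midnight boundaries: start of yesterday (inclusive) up to start of today (exclusive).
$EarliestModifiedTime = (Get-Date).Date.AddDays(-1)
$LatestModifiedTime = (Get-Date).Date
# Note: this keeps the CreationTime check from the question, even though the variable names say "Modified".
Get-ChildItem (Join-Path $SourceFolder '*.csv') |
    Where-Object { $_.CreationTime -ge $EarliestModifiedTime -and $_.CreationTime -lt $LatestModifiedTime } |
    ForEach-Object {
        Copy-Item $_.FullName -Destination $DestinationFolder -Force
        Write-Host "Copied $($_.Name)"
    }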

PowerShell script to delete some old data to free up space up to a certain limit

I am new to PowerShell scripting. I have a PowerShell script that checks the available disk space and deletes old subfolders from a folder until the free space reaches a threshold level.
My code deletes all the folders and never exits. I am checking whether the free space is greater than the desired amount and trying to terminate there ($part.FreeSpace -gt $desiredBytes).
I have echoed $desiredBytes, $part.FreeSpace and $directoryInfo.count. Even when a huge folder gets deleted, the values of these variables are not updated. So all the folders get deleted and the script still does not terminate.
Could someone help me with this? Thanks in advance :)
[WMI]$part = "Win32_LogicalDisk.DeviceID='D:'"
$directory = "D:\Suba\Suba\"
$desiredGiB = 262
$desiredBytes = $desiredGiB * 1073741824
$directoryInfo = Get-ChildItem $directory | Measure-Object
$directoryInfo.count #Returns the count of all of the objects in the directory
do {
    if ($part.FreeSpace -gt $desiredBytes) {
        exit
    }
    if ($directoryInfo.count -gt 0) {
        echo $desiredBytes
        echo $part.FreeSpace
        echo $directoryInfo.count
        foreach ($root in $directory) {
            Get-ChildItem $root -Recurse |
                Sort-Object CreationTime |
                Select-Object -First 1 |
                Remove-Item -Force
        }
    }
    else {
        Write-Host "Enough Files are not there in this directory!!"
        exit
    }
} while ($part.FreeSpace -lt $desiredBytes)
The problem is that you need to make the WMI query again every time after deleting a subfolder, so that the free space value is updated.
Check out my version:
$drive = "D:"
$directory = "$drive\Suba\Suba"
$desiredSpace = 262 * 1gb
$subfolders = [System.Collections.ArrayList]@(Get-ChildItem $directory | where { $_.PSIsContainer } | sort LastWriteTime)
while (([wmi]"Win32_LogicalDisk.DeviceID='$drive'").FreeSpace -lt $desiredSpace) {
    if ($subfolders.Count -eq 0) {
        Write-Warning "Not enough sub-folders to delete."
        break
    }
    Remove-Item $subfolders[0].FullName -Recurse -Force -Confirm:$false
    $subfolders.RemoveAt(0)
}
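The key point of this version is that the [wmi] query sits inside the while condition, so FreeSpace is re-read on every iteration instead of being cached once before the loop. On PowerShell 3.0 or later the same refresh could be written with Get-CimInstance; a hedged, self-contained sketch (not part of the original answer):
$drive = "D:"
$desiredSpace = 262 * 1gb
# Re-query free space each time this line runs; the -Filter is assumed to match the drive letter in $drive.
$freeSpace = (Get-CimInstance Win32_LogicalDisk -Filter "DeviceID='$drive'").FreeSpace
if ($freeSpace -ge $desiredSpace) {
    Write-Host ("Enough free space: {0:N1} GB" -f ($freeSpace / 1gb))
}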

Renaming Folders ascending

I want to automate my backups and always keep some old versions. The idea was to use Windows Backup on a share and use a PowerShell script to start it.
I'm almost done, but I'm stuck at the renaming.
Share: \\Server\SysBackup\WindowsImageBackup
In that share there are folders for all the PCs I have. So, for example, if I want to keep the last three backups, it should do the following:
Current Backup: PC1
Old Backups: PC1_1, PC1_2
Now I want to rename them to one higher number
PC1_2 → PC1_3
PC1_1 → PC1_2
PC1 → PC1_1
So the backup now can use the PC1 Folder for the newest backup.
That's what I tried so far:
$BackupFolder = Get-ChildItem -Directory ($Target + "\WindowsImageBackup") |
    Where-Object -Property Name -Like $env:computername* |
    Select-Object -Property Name
$CurrentBackups = $BackupFolder.Count
if ($CurrentBackups -ge 1) {
    Push-Location ($Target + "\WindowsImageBackup")
    $i = 0
    $xCurrentArray = $CurrentBackups - 1
    $NewSubVersion = $CurrentBackups
    while ($i -le $CurrentBackups) {
        $NewName = $BackupFolder[$xCurrentArray].Name.TrimEnd("_")
        Rename-Item $BackupFolder[$xCurrentArray] -NewName
    }
    Pop-Location
    Clear-Variable $i
}
The files are not renamed, and I'm getting the following error:
You cannot call a method on a null-valued expression.
Rename-Item : Missing an argument for parameter 'NewName'. Specify a parameter of type 'System.String' and try again.
Where is my mistake?
I found the error
if ($CurrentBackups -ge 1)
{
    Push-Location ($Target + "\WindowsImageBackup\")
    $i = 0
    $xCurrentArray = $CurrentBackups - 1
    $NewSubVersion = $CurrentBackups
    while ($i -lt $CurrentBackups)
    {
        if ($BackupFolder[$xCurrentArray].Contains("_"))
        {
            $Index = $BackupFolder[$xCurrentArray].IndexOf("_")
            $NewName = $BackupFolder[$xCurrentArray].Substring(0, $Index)
            Rename-Item $BackupFolder[$xCurrentArray] -NewName ($NewName + "_" + $NewSubVersion)
            Clear-Variable Index
            $NewSubVersion--
        }
        else
        {
            $NewName = $BackupFolder[$xCurrentArray]
            Rename-Item $BackupFolder[$xCurrentArray] -NewName ($NewName + "_1")
        }
        $i++
        $xCurrentArray--
    }
    Pop-Location
    Clear-Variable i
}
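For comparison, here is a hedged, untested sketch of the same rotation done by renaming the highest-numbered folder first, so no rename can collide with an existing name. It assumes the $Target variable and the WindowsImageBackup layout from the question:
$root = Join-Path $Target "WindowsImageBackup"
$name = $env:computername
# Bump the numbered copies from highest to lowest: PC1_2 -> PC1_3, PC1_1 -> PC1_2, ...
Get-ChildItem $root -Directory -Filter "${name}_*" |
    Sort-Object { [int]($_.Name.Substring($name.Length + 1)) } -Descending |
    ForEach-Object {
        $n = [int]($_.Name.Substring($name.Length + 1))
        Rename-Item $_.FullName -NewName "${name}_$($n + 1)"
    }
# Finally free up the current name for the next backup: PC1 -> PC1_1.
if (Test-Path (Join-Path $root $name)) {
    Rename-Item (Join-Path $root $name) -NewName "${name}_1"
}
Pruning anything beyond the three copies you want to keep would then just be a matter of removing folders whose numeric suffix is greater than 3.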

A script that accepts a path as a parameter and then deletes all empty dirs needs some enhancement

With help from @Shay Levy I have this script:
param (
[parameter (mandatory=$true,position=0)]
[string]$Path
)
Get-ChildItem $Path -Recurse -Force |
Where-Object {$_.PSIsContainer -and (Get-ChildItem $_.FullName -Force | Measure-Object).Count -eq 0} | Remove-Item
What it does is: accept a path as a parameter, find all dirs recursively, and if a dir is an empty folder (no child item underneath), remove it. It works fine, but I want to make it perfect.
An example to explain a special situation: I have this folder structure: folder/subfolder1.1/subfolder2.1/subfolder3.1, where every subfolder is the only child item underneath its parent. I have a dir called "folder", and it contains only one subfolder, "subfolder1.1", and no files. "Subfolder1.1" contains only one subfolder, "subfolder2.1", and no files. "Subfolder2.1" contains "subfolder3.1" and no files. After running this script, what actually gets removed is "subfolder3.1", which makes sense.
Here is the catch: after "subfolder3.1" is removed, its parent folder "subfolder2.1" becomes an empty folder and could be removed too. But since the script has already passed that point, "subfolder2.1" cannot be removed until I run the script one more time.
The perfect script would remove "subfolder3.1" first, then check its parent folder and find "Oh, its parent subfolder2.1 is empty too now, let's delete subfolder2.1". After removing subfolder2.1, it would check its parent and find "Oh, subfolder1.1 is empty now, let's delete it". Eventually, after all the subfolders are removed, the top level of the structure, "folder", would be removed since it is empty.
I add a "sort -descending" in the script and it seems doesn't do anything for me.
param (
[parameter (mandatory=$true,position=0)]
[string]$Path
)
Get-ChildItem $Path -Recurse -Force |
Where-Object {$_.PSIsContainer -and (Get-ChildItem $_.FullName -Force | Measure-Object).Count -eq 0} |
Sort -descending |Remove-Item
The logic behind adding the "sort" command is this: the above folder structure is piped in this order:
folder
folder/subfolder1.1
folder/subfolder1.1/subfolder2.1
folder/subfolder1.1/subfolder2.1/subfolder3.1
In the above order, after "folder/subfolder1.1/subfolder2.1/subfolder3.1" is removed, 2.1, 1.1 and folder cannot be removed since the script has already passed those points. So I put "sort -descending" in the script hoping the folder structure would be piped in this order:
folder/subfolder1.1/subfolder2.1/subfolder3.1
folder/subfolder1.1/subfolder2.1
folder/subfolder1.1
folder
My hope was that it would remove 3.1 first, then remove 2.1 (since 2.1 becomes empty after 3.1 is removed), and so on. This dream is beautiful but it doesn't work.
==================================
Update1:
Apr 25, 2013: @mjolinor, thank you for the help. I ran it right away and got this error:
...............
+ Foreach {
+ ~
Missing opening '(' after keyword 'foreach'.
...............
I fixed it by modifying this line in the first foreach loop:
...............
Foreach ($dir in $dirs) {
...............
Now the above error is gone and I have moved one step further. When I run it I can see it iterating over folders, but it ends up with another error:
...............
Index operation failed; the array index evaluated to null.
At C:\Documents\ManualScripts\Cleanup-no-file-and-subdir-dir-rev02.ps1:11 char:13
+ $dirs[$_] = (Get-ChildItem $_ -Force | Measure-Object).Count
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : InvalidOperation: (:) [], RuntimeException
+ FullyQualifiedErrorId : NullArrayIndex
...............
Working on it and will share with you.
=============================
Update2:
Apr 25, 2013
The reason for the above error is this line:
...........
Foreach ($dir in $dirs) {
...........
Change it into:
...........
Foreach ($dir in $_) {
...........
The error is gone. I can see the list of dirs scrolling down the console, but it seems like "sort" did not work as I expected, since the test folders still show up in "ascending" order. The excerpt is as follows:
...........
H:\archive\folder\Sub1
H:\archive\folder\Sub1\Sub2.1
H:\archive\folder\Sub1\Sub2.1\Sub3.1
...........
"folder\Sub1\Sub2.1\Sub3.1" is the test folder structure I created.
===================================
Update 3
Apr 26, 2013.
It works now!
.........................................
param (
    [parameter (mandatory = $True, position = 0)]
    [string]$Path
)
$dirs = @{}
Get-ChildItem $Path -Recurse -Force |
    Where-Object {$_.PSIsContainer} |
    Select -ExpandProperty FullName |
    Foreach {
        $dirs[$_] = (Get-ChildItem $_ -Force | Measure-Object).Count
    }
$dirs.keys | Sort Length -Descending # This line is only for displaying output, to monitor that the sorting is all right
$dirs.keys | Sort Length -Descending |
    Foreach {
        If ($dirs[$_] -eq 0)
        {
            Remove-Item $_
            $dirs[($_ | Split-Path -Parent)]--
        }
    }
...................................
Thanks @mjolinor. I like the hash table :)
Not tested, but I think this should work:
param (
[parameter (mandatory=$true,position=0)]
[string]$Path
)
$dirs = @{}
Get-ChildItem $Path -Recurse -Force |
Where-Object {$_.PSIsContainer} |
select -ExpandProperty FullName
foreach {
$dirs[$_] = (Get-ChildItem $_ -Force | Measure-Object).Count
}
$dirs.keys | Sort length -descending |
foreach {
if ($dirs[$_] -eq 0)
{
Remove-Item $_
$dirs[($_ | Split-Path -Parent)]--
}
}
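For comparison, another way to get the same cascading effect without the hash table is simply to repeat the original one-liner until a pass removes nothing. A hedged sketch (slower on large trees, since it rescans after every pass):
param (
    [parameter(mandatory = $true, position = 0)]
    [string]$Path
)
do {
    # Collect the currently-empty directories, then delete them; parents emptied by
    # this pass are picked up on the next pass.
    $empty = @(Get-ChildItem $Path -Recurse -Force |
        Where-Object { $_.PSIsContainer -and (Get-ChildItem $_.FullName -Force | Measure-Object).Count -eq 0 })
    $empty | Remove-Item -Force
} while ($empty.Count -gt 0)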

Creating a PowerShell script to back up a file and append the date

Currently I have a one line batch file to back up a file. I run it manually when I need to back up a file. The only thing I would like to add to it is the current date. Here is what I have:
xcopy /W /Y ACTIVE.DB ACTIVE.DB.BACKUP
The destination file should simply be ACTIVE.DB.BACKUP.YYYYMMDD. How would I go about creating a script that will allow me to double-click on it from Windows Explorer and make the xcopy happen?
Just to point out that you can do this with Copy-Item e.g.:
Set-Location $path
Copy-Item ACTIVE.DB "ACTIVE.DB.$(get-date -f yyyyMMdd)" -Force -Confirm
If you're going for robustness then I'd use robocopy.exe.
You can customize the destination filename by embedding a formatted [datetime]::now in it in PowerShell like so:
xcopy /W /Y ACTIVE.DB "ACTIVE.DB.BACKUP.$([datetime]::now.ToString('yyyy-MM-dd'))"
If the line feels busy and unmaintainable, you can refactor it to multiple lines:
$now = [datetime]::now.ToString('yyyy-MM-dd')
xcopy /W /Y ACTIVE.DB "ACTIVE.DB.BACKUP.$now"
To get double-click execution, I usually make a batch file that runs the PowerShell command as described here:
Set up PowerShell Script for Automatic Execution
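As a concrete (hypothetical) example, the PowerShell side could be as small as the script below, with the batch file reduced to the single powershell.exe line shown in the comment:
# backup.ps1 (hypothetical name) - copy ACTIVE.DB to a date-stamped backup next to it.
$source = 'C:\Data\ACTIVE.DB'   # adjust to wherever ACTIVE.DB actually lives
Copy-Item $source "$source.BACKUP.$(Get-Date -Format 'yyyyMMdd')" -Force

# The double-clickable batch file would then contain a single line such as:
#   powershell.exe -NoProfile -ExecutionPolicy Bypass -File "C:\Scripts\backup.ps1"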
I just made a Daily/Weekly/Monthly/Quarterly/Yearly backup script in PowerShell this month; I hope it helps.
This DWMQY backup scenario zips a source folder to a date-named zip file, then keeps the zips of:
the last 7 days
4 weeks (each Friday's zip)
6 months (the last Friday of each month)
4 quarters (the last month of each quarter)
2 years (the last quarter of each year).
Running as a scheduled task on a daily basis, it puts the zips into a target folder which is also a Microsoft OneDrive local folder, so the zips are also remotely sync'd to the OneDrive server. Outdated (non-Friday daily or non-last-DWMQY) zips are moved to a folder that is not remotely sync'd.
It's March 5, 2016 today, so the following zips should be in the target folder:
7 days: 160304-160229-160227
4 weeks: 160304, 160226, 160219,160212
6 months: 160226, 160129, 161225, 151127, 151025, 150925
4 quarters: 151225, 150925,150626,150327
2 years: 151225, 141226
So there will be 23 zips (actually fewer, because of the dups among DWMQY); our files are 250 text documents, which is 0.4 GB after zipping, so it's 23 * 0.4 = 9.2 GB in total, which is less than OneDrive's free 15 GB quota.
For large source data, 7-Zip can be used, which supports a maximum archive size of about 16 million TB. I haven't tried backing up folders directly instead of as zips; I'm guessing the procedure transfers from the current zip-based approach.
# Note: there are following paths:
# 1. source path: path to be backed up.
# 2. target path: current zips stored at, which is also a remote-sync pair's local path.
# 3. moved-to path: outdated zips to be moved in this non-sync'able location.
# 4. temp path: to copy the source file in to avoid zip.exe failing of compressing them if they are occupied by some other process.
# Function declaration
. C:\Source\zipSaveDated\Functions.ps1
# <1> Zip data
$sourcePath = '\\remoteMachine1\c$\SourceDocs\*'
$TempStorage = 'C:\Source\TempStorage'
$enddate = (Get-Date).tostring("yyyyMMdd")
$zipFilename = '\\remoteMachine2\d$\DailyBackupRemote\OneDrive\DailyBackupRemote_OneDrive\' + $enddate + '_CompanyDoc.zip'
Remove-Item ($TempStorage + '\*') -recurse -Force
Copy-Item $sourcePath $TempStorage -recurse -Force
Add-Type -A System.IO.Compression.FileSystem
[IO.Compression.ZipFile]::CreateFromDirectory($TempStorage, $zipFilename)
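# (Alternative, not part of the original script: for very large sources the two lines above
#  could be swapped for 7-Zip, assuming 7z.exe is installed at the usual path, e.g.
#  & 'C:\Program Files\7-Zip\7z.exe' a $zipFilename "$TempStorage\*" )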
# <2> Move old files
$SourceDir = "\\remoteMachine2\d$\DailyBackupRemote\OneDrive\DailyBackupRemote_OneDrive"
$DestinationDir = "\\remoteMachine2\d$\DailyBackupRemote\bak" # to store files moved out of the working folder (OneDrive)
$KeepDays = 7
$KeepWeeks = 4
$KeepMonths = 6
$KeepQuarters = 4
$KeepYears = 2
# <2.1>: Loop files
$Directory = $DestinationDir
if (!(Test-Path $Directory))
{
New-Item $directory -type directory -Force
}
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# daily removal will not remove weekly copy, 7
If($file.LastWriteTime -lt (Get-Date).adddays(-$KeepDays).date `
-and $file.LastWriteTime.DayOfWeek -NotMatch "Friday" `
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# weekly removal will not remove monthly copy, 4
If($file.LastWriteTime -lt (Get-Date).adddays(-$KeepWeeks * 7).date `
-and (Get-LastFridayOfMonth ($file.LastWriteTime)).Date.ToString("yyyyMMdd") -NotMatch $file.LastWriteTime.Date.ToString("yyyyMMdd")
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# monthly removal will not remove quarterly copy, 6
If($file.LastWriteTime.Month -lt ((Get-Date).Year - $file.LastWriteTime.Year) * 12 + (Get-Date).Month - $KeepMonths `
-and $file.LastWriteTime.Month -NotIn 3, 6, 9, 12
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# quarterly removal will not remove yearly copy, 4
If($file.LastWriteTime.Month -lt ( (Get-Date).Year - $file.LastWriteTime.Year) * 12 + (Get-Date).Month - $KeepQuarters * 3 `
-and $file.LastWriteTime.Month -NotIn 12
)
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
$files = get-childitem $SourceDir *.*
foreach ($file in $files)
{ # L1
# yearly removal will just go straight ahead. 2
If($file.LastWriteTime.Year -lt (Get-Date).Year - $KeepYears )
{
Move-Item $file.fullname $Directory -force
}
} # L1 >>
<Functions.ps1>
function Get-TimesResult3
{
Param ([int]$a,[int]$b)
$c = $a * $b
Write-Output $c
}
function Get-Weekday {
param(
$Month = $(Get-Date -format 'MM'),
$Year = $(Get-Date -format 'yyyy'),
$Days = 1..5
)
$MaxDays = [System.DateTime]::DaysInMonth($Year, $Month)
1..$MaxDays | ForEach-Object {
Get-Date -day $_ -Month $Month -Year $Year |
Where-Object { $Days -contains $_.DayOfWeek }
}
}
function Get-LastFridayOfMonth([DateTime] $d) {
$lastDay = new-object DateTime($d.Year, $d.Month, [DateTime]::DaysInMonth($d.Year, $d.Month))
$diff = ([int] [DayOfWeek]::Friday) - ([int] $lastDay.DayOfWeek)
if ($diff -gt 0) {
return $lastDay.AddDays(- (7-$diff))
}
else {
return $lastDay.AddDays($diff)
}
}
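A quick sanity check for the helpers (a hedged example, not part of the original post):
# Dot-source the functions, then verify the last Friday of March 2016 is the 25th.
. C:\Source\zipSaveDated\Functions.ps1
(Get-LastFridayOfMonth (Get-Date '2016-03-05')).ToString('yyyy-MM-dd')   # expected: 2016-03-25

# Get-Weekday lists matching weekdays; e.g. all Fridays (DayOfWeek 5) in March 2016.
Get-Weekday -Month 3 -Year 2016 -Days 5 | ForEach-Object { $_.ToString('yyyy-MM-dd') }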