I have a folder containing backups of SQL databases, with the backup date in each file name, e.g. C:\Backup.
Example of backup files:
archive_1_01022022.bak
archive_1_02022022.bak
archive_1_03022022.bak
archive_2_01022022.bak
archive_2_02022022.bak
archive_2_03022022.bak
archive_3_01022022.bak
archive_3_02022022.bak
archive_3_03022022.bak
I need a PowerShell script that removes all files from this directory except recent ones (e.g. from the last 5 days), but at the same time keeps at least 3 copies of each database (in case no backups have been made for more than the last 5 days).
The script below removes all files except those from the last 5 days:
$Folder = "C:\Backup"
$CurrentDate = Get-Date
$DateDel = $CurrentDate.AddDays(-5)
Get-ChildItem $Folder | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item
The above works fine, but if there are no recent backups for the last 10 days and I run it, it will remove all files in C:\Backup. For such cases I need to keep at least 3 backup files of each database.
If I use the code below (for example, with 9 different databases), then it does the job:
$Folder = "C:\Backup"
Get-ChildItem $Folder | ? { -not $_.PSIsContainer } |
Sort-Object -Property LastWriteTime -Descending |
Select-Object -Skip 27 |
Remove-Item -Force
But this implementation is awkward. For example, with backups of 9 databases I have to give Select-Object -Skip the value 27 (9 databases x 3 files kept each). Whenever I have more or fewer databases, I need to adjust this number. How can I make Select-Object -Skip 3 a static value?
In that case, you need to count how many files in the folder are newer than or equal to the reference date. If there are fewer than 3, sort all files by the LastWriteTime property and keep the top 3. If enough newer files are left, you can delete the old ones:
$Folder = "C:\Backup"
$DateDel = (Get-Date).AddDays(-5).Date # set to midnight
# get a list of all backup files
$allFiles = Get-ChildItem -Path $Folder -Filter 'archive*.bak' -File
# test how many of these are newer than 5 days ago
$latestFiles = @($allFiles | Where-Object { $_.LastWriteTime -ge $DateDel })
if ($latestFiles.Count -lt 3) {
    # if less than three, keep the latest 3 files and remove the rest
    $allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf
}
else {
    # there are plenty of newer files, so we can remove the older ones
    $allFiles | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item -WhatIf
}
I have added the -WhatIf safety switch to both Remove-Item cmdlets, so you can first see what would happen before actually destroying files. Once you are satisfied with what the console shows, remove those -WhatIf switches and run again.
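If you prefer to flip a single flag instead of editing two lines, note that -WhatIf also accepts a boolean, so a variable can drive both cmdlets (a minimal sketch):

$simulate = $true # flip to $false to actually delete
$allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf:$simulate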
If you have 9 databases, and the number in the filename after archive_ is what distinguishes the database backup files, just put the first script above inside a loop and adjust the -Filter:
$Folder = "C:\Backup"
$DateDel = (Get-Date).AddDays(-5).Date # set to midnight
# loop through the 9 database files
for ($i = 1; $i -le 9; $i++) {
    # get a list of all backup files per database
    $allFiles = Get-ChildItem -Path $Folder -Filter "archive_$($i)_*.bak" -File
    # test how many of these are newer than 5 days ago
    $latestFiles = @($allFiles | Where-Object { $_.LastWriteTime -ge $DateDel })
    if ($latestFiles.Count -lt 3) {
        # if less than three, keep the latest 3 files and remove the rest
        $allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf
    }
    else {
        # there are plenty of newer files, so we can remove the older ones
        $allFiles | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item -WhatIf
    }
}
OK, so now that we know the example names you gave bear no resemblance to the real names, the code could be as simple as this:
$dbNames = 'archive', 'master', 'documents', 'rb' # the names used in the backup files each database creates
$Folder = "C:\Backup"
$DateDel = (Get-Date).AddDays(-5).Date # set to midnight
# loop through the database files
foreach ($name in $dbNames) {
    # get a list of all backup files per database
    $allFiles = Get-ChildItem -Path $Folder -Filter "$($name)_*.bak" -File
    # test how many of these are newer than 5 days ago
    $latestFiles = @($allFiles | Where-Object { $_.LastWriteTime -ge $DateDel })
    if ($latestFiles.Count -lt 3) {
        # if less than three, keep the latest 3 files and remove the rest
        $allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf
    }
    else {
        # there are plenty of newer files, so we can remove the older ones
        $allFiles | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item -WhatIf
    }
}
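If you would rather not maintain a list of names at all, here is a variation (just a sketch, assuming every file name ends in an underscore followed by an 8-digit date) that derives the database name from the file name itself with Group-Object:

$Folder = "C:\Backup"
$DateDel = (Get-Date).AddDays(-5).Date # set to midnight
Get-ChildItem -Path $Folder -Filter '*.bak' -File |
    Group-Object { $_.BaseName -replace '_\d{8}$' } | # group on the name part before the date suffix
    ForEach-Object {
        $allFiles = $_.Group
        $latestFiles = @($allFiles | Where-Object { $_.LastWriteTime -ge $DateDel })
        if ($latestFiles.Count -lt 3) {
            $allFiles | Sort-Object LastWriteTime -Descending | Select-Object -Skip 3 | Remove-Item -WhatIf
        }
        else {
            $allFiles | Where-Object { $_.LastWriteTime -lt $DateDel } | Remove-Item -WhatIf
        }
    }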
Based on the assumption that your backups follow the naming convention DBNAME_ddMMyyyy.bak, where the date corresponds to the backup date, I would do something like the following.
$Params = @{
    MinBackupThreshold = 1
    MinBackupDays      = 5
    SimulateDeletion   = $false # Set to $true to perform a simulated (Remove-Item -WhatIf) deletion
}
$Folder = "C:\temp\test"
$CurrentDate = Get-Date
$DateDel = $CurrentDate.AddDays(-$Params.MinBackupDays).Date # set to midnight

$Archives = foreach ($File in Get-ChildItem $Folder) {
    # -13 comes from assuming the naming convention DBName_8CharBackupDate.ext (eg: Db1_01012022.bak)
    $DbNameEndIndex = $File.Name.Length - 13
    # +1 since our naming convention has an underscore between db name and date
    $RawDateStr = $File.Name.Substring($DbNameEndIndex + 1, 8)
    [PSCustomObject]@{
        Path          = $File.FullName
        LastWriteTime = $File.LastWriteTime
        DBName        = $File.Name.Substring(0, $DbNameEndIndex)
        BackupDate    = [datetime]::ParseExact($RawDateStr, 'ddMMyyyy', $null)
    }
}

# Here we group archives by their DBName so we can make sure to keep a minimum number of backups for each.
$GroupedArchives = $Archives | Group-Object DBName
foreach ($Db in $GroupedArchives) {
    if ($Db.Count -gt $Params.MinBackupThreshold) {
        $Db.Group | Sort-Object BackupDate |
            Select-Object -Skip $Params.MinBackupThreshold |
            Where-Object { $_.BackupDate -lt $DateDel } |
            ForEach-Object { Remove-Item -Path $_.Path -Force -WhatIf:$Params.SimulateDeletion }
    } else {
        # You could include additional checks here to verify the last backup, alert you if there should be more, etc...
    }
}
Note: using the date extracted from the filename is more accurate than LastWriteTime, which could be updated for other reasons (since we have it, we might as well use it).
Note 2: I added the WhatIf option to $Params so you can easily switch between actual removal and simulation (Theo's answer gave me the idea of providing that switch), and borrowed his .Date trick to make sure the reference date is set to midnight instead of the current time of day.
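One more hedge worth mentioning: if files that don't follow the naming convention can end up in the folder, [datetime]::ParseExact will throw. A sketch of a tolerant variant that could replace that line inside the loop, skipping non-conforming names:

$parsed = [datetime]::MinValue
if (-not [datetime]::TryParseExact($RawDateStr, 'ddMMyyyy',
        [System.Globalization.CultureInfo]::InvariantCulture,
        [System.Globalization.DateTimeStyles]::None, [ref]$parsed)) {
    continue # file name does not contain a valid ddMMyyyy date, skip it
}
# $parsed now holds the backup date; use it for the BackupDate property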
Related
Can anyone here help me repeat the code from the beginning after all iterations in the foreach loop have completed? The code below finds all files containing the pattern 'qwerty', feeds the list into a foreach loop, and displays the file name and the last 10 lines of each file; the code should terminate when there is no new or updated file within a certain amount of time.
$today=(Get-date).Date
$FILES=Get-ChildItem -Path C:\Test\ | `
Where-Object {$_.LastWriteTime -ge $today} | `
Select-String -pattern "qwerty" | `
Select-Object FileName -Unique
foreach ($i in $FILES) {
Write-host $i -foregroundcolor red
Get-content -Path \\XXXXXX\$i -tail 10
Start-Sleep 1
}
You can use this:
For ($r = 0; $r -lt $NumberOfTimesToRepeat; $r++) {
$today=(Get-date).Date
$FILES=Get-ChildItem -Path C:\Test\ | `
Where-Object {$_.LastWriteTime -ge $today} | `
Select-String -pattern "qwerty" | `
Select-Object FileName -Unique
foreach ($i in $FILES) {
Write-host $i -foregroundcolor red
Get-content -Path \\XXXXXX\$i -tail 10
Start-Sleep 1
}
}
PS: Replace $NumberOfTimesToRepeat with the number of times you want the code to repeat.
If I understand the question properly, you would like to test for files in a certain folder containing a certain string. For each of these files, the last 10 lines should be displayed.
The first difficulty comes from the fact that you want to do this inside a loop and test new or updated files.
That means you need to keep track of files you have already tested and only display new or updated files. The code below uses a Hashtable $alreadyChecked for that so we can test if a file is either new or updated.
If no new or updated files are found during a certain time, the code should end. To do that, I'm using two other variables: $endTime and $checkTime.
$checkTime gets updated on every iteration, making it the current time
$endTime only gets updated if files were found.
$today = (Get-Date).Date
$sourceFolder = 'D:\Test'
$alreadyChecked = @{} # a Hashtable to keep track of files already checked
$maxMinutes = 5 # the max time in minutes to perform the loop when no new files or updates are added
$endTime = (Get-Date).AddMinutes($maxMinutes)
do {
    $checkTime = Get-Date
    $files = Get-ChildItem -Path $sourceFolder -File |
        # only files created today and that have not been checked already
        Where-Object { $_.LastWriteTime -ge $today -and
                       (!$alreadyChecked.ContainsKey($_.FullName) -or
                       $alreadyChecked[$_.FullName].LastWriteTime -ne $_.LastWriteTime) } |
        ForEach-Object {
            $filetime = $_.LastWriteTime
            $_ | Select-String -Pattern "qwerty" -SimpleMatch | # -SimpleMatch if you don't use a Regex match
                Select-Object Path, FileName, @{Name = 'LastWriteTime'; Expression = { $filetime }}
        }
    if ($files) {
        foreach ($item in $files) {
            Write-Host $item.Filename -ForegroundColor Red
            Write-Host (Get-Content -Path $item.Path -Tail 10)
            Write-Host
            # update the Hashtable to keep track of files already done
            $alreadyChecked[$item.Path] = $item | Select-Object FileName, LastWriteTime
            Start-Sleep 1
        }
        # files were found, so update the time to check for no updates/new files
        $endTime = (Get-Date).AddMinutes($maxMinutes)
    }
    # exit the loop if no new or updated files have been found during $maxMinutes time
} while ($checkTime -le $endTime)
For the demo I'm using a 5-minute window for the loop to expire when no new or updated files are found, but you can change that to suit your needs.
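An event-driven alternative is also worth considering: instead of polling, a FileSystemWatcher can block until something in the folder changes (a rough sketch, assuming a local folder; you would still run Select-String on the reported file before printing it):

$watcher = New-Object System.IO.FileSystemWatcher 'D:\Test'
$types = [System.IO.WatcherChangeTypes]::Created -bor [System.IO.WatcherChangeTypes]::Changed
do {
    # block for up to 5 minutes waiting for a new or updated file
    $result = $watcher.WaitForChanged($types, 5 * 60 * 1000)
    if (-not $result.TimedOut) {
        Write-Host $result.Name -ForegroundColor Red
        Get-Content -Path (Join-Path 'D:\Test' $result.Name) -Tail 10
    }
} until ($result.TimedOut)
$watcher.Dispose()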
I need to delete all archived files and folders older than 15 days.
I have implemented a solution using a PowerShell script, but it takes more than a day to delete all the files, even though the total size of the folder is less than 100 GB.
$StartFolder = "\\Guru\Archive\"
$deletefilesolderthan = "15"
#Get Foldernames for ForEach Loop
$SubFolders = Get-ChildItem -Path $StartFolder |
Where-Object {$_.PSIsContainer -eq "True"} |
Select-Object Name
#Loop through folders
foreach ($Subfolder in $SubFolders) {
Write-Host "Processing Folder:" $Subfolder
#For each folder, recurse and delete files older than the specified number of days, leaving the folder structure intact.
Get-ChildItem -Path $StartFolder$($Subfolder.name) -Include *.* -File -Recurse |
Where LastWriteTime -lt (Get-Date).AddDays(-$deletefilesolderthan) |
foreach {$_.Delete()}
#$dirs will be an array of empty directories returned after filtering; loop until $dirs is empty, excluding the "Inbound" and "Outbound" folders.
do {
$dirs = gci $StartFolder$($Subfolder.name) -Exclude Inbound,Outbound -Directory -Recurse |
Where {(gci $_.FullName).Count -eq 0} |
select -ExpandProperty FullName
$dirs | ForEach-Object {Remove-Item $_}
} while ($dirs.Count -gt 0)
}
Write-Host "Completed" -ForegroundColor Green
#Read-Host -Prompt "Press Enter to exit"
Please suggest some way to optimise the performance.
If you have many small files, the long delete time is not abnormal, because each file descriptor has to be processed individually. Some improvements can be made depending on your version; I'm going to assume you're on at least v4.
#requires -Version 4
param(
    [string]
    $start = '\\Guru\Archive',

    [int]
    $thresholdDays = 15
)

# getting the name wasn't useful. keep objects as objects
foreach ($folder in Get-ChildItem -Path $start -Directory) {
    "Processing Folder: $folder"

    # get all items once
    $folders, $files = ($folder | Get-ChildItem -Recurse).
        Where({ $_.PSIsContainer }, 'Split')

    # process files
    $files.Where{
        $_.LastWriteTime -lt (Get-Date).AddDays(-$thresholdDays)
    } | Remove-Item -Force

    # process folders
    $folders.Where{
        $_.Name -notin 'Inbound', 'Outbound' -and
        ($_ | Get-ChildItem).Count -eq 0
    } | Remove-Item -Force
}
"Complete!"
The reason it takes so much time is that you are deleting files and folders over the network, which adds network round-trips for every single file and folder. You can easily verify that with a network analyzer. The best approach here is to run the code that performs the file operations on the remote machine itself, for example using:
WinRM
psexec (first copy code to remote machine and then execute it using psexec)
remote WMI (using CIM_Datafile)
or even adding needed task to the scheduler
I would prefer WinRM, but psexec is also a good choice (if you don't want to perform additional configuration of WinRM).
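For the WinRM route, the idea is to hand the whole deletion over to the file server so every file operation stays local there. A minimal sketch, assuming PowerShell remoting is enabled on the server (here the host name Guru from the share path) and that the share corresponds to a local path such as D:\Archive (adjust both to your environment):

Invoke-Command -ComputerName Guru -ScriptBlock {
    param($root, $days)
    # runs on the server, so no per-file network round-trips
    Get-ChildItem -Path $root -File -Recurse |
        Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-$days) } |
        Remove-Item -Force
} -ArgumentList 'D:\Archive', 15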
I'm trying to create a PowerShell script to schedule backup deletion so that the HDD doesn't fill up.
First, I want to determine which file is the newest.
Then I want to check whether its file size differs by more than 10% from the second-newest file.
If the file size is within that range, delete all but the newest file.
If the file size is more than 10% smaller or larger than the second-newest file, delete all but the newest and second-newest files.
I'd appreciate help with how to structure the code to make this work.
I've started with the code below, which deletes all files older than 2 days, but I'm not quite sure how to change it to keep the newest file regardless of age.
$Path = "C:\Temp\Backup\Folder1\"
$Days = 2
$Date = Get-Date
$Include = "*.gpg"
$Exclude = "*.txt"
Get-ChildItem $Path -Recurse |
Where-Object {-not $_.PSIsContainer -and $Date.Subtract($_.CreationTime).Days -gt $Days } |
Remove-Item -WhatIf
You could do something like this:
$BackupFiles = Get-ChildItem -File | Sort-Object LastWriteTime -Descending
$LatestBackup = $BackupFiles | Select-Object -First 1
$PrevBackup = $BackupFiles | Select-Object -Skip 1 -First 1
$BackupSizeThreshold = $PrevBackup.Length * 0.1
$FilesToRemove = if ($LatestBackup.Length -le ($PrevBackup.Length + $BackupSizeThreshold) -and
                     $LatestBackup.Length -ge ($PrevBackup.Length - $BackupSizeThreshold)) {
    $BackupFiles | Select-Object -Skip 1
}
else {
    $BackupFiles | Select-Object -Skip 2
}
$FilesToRemove | Remove-Item -WhatIf
Remove the -WhatIf if you're seeing the results you expect.
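The two-sided range test can also be collapsed into one absolute-difference comparison, which some find easier to read (an equivalent sketch, reusing the same variables):

# is the newest backup within 10% of the previous backup's size?
if ([math]::Abs($LatestBackup.Length - $PrevBackup.Length) -le $BackupSizeThreshold) {
    $FilesToRemove = $BackupFiles | Select-Object -Skip 1
}
else {
    $FilesToRemove = $BackupFiles | Select-Object -Skip 2
}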
I'm using a PowerShell script to retrieve a file from a remote directory. I only want to retrieve a file if it was modified within the last hour. I was able to get the most recent file using the following code:
$directoryInfo = $session.ListDirectory($remotePath)
$latest =
$directoryInfo.Files |
Where-Object { -Not $_.IsDirectory } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
I believe that I need to add another condition to the Where-Object clause, but I don't know the proper format. For example,
Where-Object { -Not $_.IsDirectory and <created/modified within the last hour> }
How do I do this? Is there a better/simpler way?
Extend your current Where-Object block to check whether LastWriteTime is greater (newer) than a DateTime object representing one hour ago. Example:
$lasthour = (Get-Date).AddHours(-1)
$directoryInfo = $session.ListDirectory($remotePath)
$latest = $directoryInfo.Files |
Where-Object { (-Not $_.IsDirectory) -and ($_.LastWriteTime -gt $lasthour) } |
Sort-Object LastWriteTime -Descending |
Select-Object -First 1
If you want to download all files created/modified within the last hour, use:
$directoryInfo = $session.ListDirectory($remotePath)
$limit = (Get-Date).AddHours(-1)
$recentFiles =
$directoryInfo.Files |
Where-Object { (-Not $_.IsDirectory) -And ($_.LastWriteTime -Gt $limit) }
foreach ($fileInfo in $recentFiles)
{
$sourcePath = [WinSCP.RemotePath]::EscapeFileMask($fileInfo.FullName)
$session.GetFiles($sourcePath, $localPath + "\*").Check()
}
The code is based on these official WinSCP .NET assembly examples:
Downloading the most recent file
Listing files matching wildcard
I deploy custom code to thousands of computers and have been able to get the return code to work correctly for one or two objects in the tool I have to use to push out code. But I am looking for a way to set up a file validator: because the dev folks don't consistently use version numbering, I have been using the code below to check the date stamp of each object.
Code:
$foo1= Get-ChildItem "C:\path\file1.exe" | Where-Object {$_.LastWriteTime -gt "11/1/2013"} | Out-String
$foo2= Get-ChildItem "C:\path\file2.exe" | Where-Object {$_.LastWriteTime -gt "9/10/2013"} | Out-String
$foo3= Get-ChildItem "C:\path\file3.exe" | Where-Object {$_.LastWriteTime -gt "4/23/2013"} | Out-String
$foo4= Get-ChildItem "C:\path\file4.exe" | Where-Object {$_.LastWriteTime -gt "12/17/2012"} | Out-String
The above works but will show the object name and the last write time. I can write the exit code with this code:
if ($foo1) {
    Write-Host '0'
}
else {
    Write-Host '5'
    Exit 5
}
Is there a way I can state that if $foo1 exists (i.e. is not $null) it reads as 0, and if it is null it reads as 1, then compute $foochecksum = $foo1 + $foo2 + $foo3 + $foo4 and run the if/else above just once to write the exit code to my deployment tool?
Functionally what I am looking for is a way of checking multiple file date / time stamps and then if all are good passing one 0 to the If/Else statement that will write a pass or fail to my deployment tool.
I could use multiple if/else's if need be but will need to check something like 40 files and would rather not have to have 40 different IF/Else statements.
I would also love to have something that might work in PS V2 and V3 as I have a mix of 2003 and 2008 servers in prod.
Thanks,
Dwight
Use a variable to hold the "error state" of your script, and a HashTable to hold the Path and LastWriteTime values for each file that you are "testing."
$ErrorExists = $false;

# Declare some file/lastwritetime pairs
$FileList = @{
    1 = @{ Path = 'C:\path\file1.exe';
           LastWriteTime = '11/1/2013'; };
    2 = @{ Path = 'C:\path\file2.exe';
           LastWriteTime = '9/10/2013'; };
    3 = @{ Path = 'C:\path\file3.exe';
           LastWriteTime = '4/23/2013'; };
    4 = @{ Path = 'C:\path\file4.exe';
           LastWriteTime = '12/17/2012'; };
};

# enumerate the Values; iterating the hashtable itself would not visit the entries
foreach ($File in $FileList.Values) {
    # If LastWriteTime is LESS than the value specified, raise an error
    if ((Get-Item -Path $File.Path).LastWriteTime -lt $File.LastWriteTime) {
        $ErrorExists = $true;
    }
}

if ($ErrorExists) {
    # Do something
}
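To tie this back to the deployment tool, the final check could emit the same 0/5 codes used earlier (a sketch):

if ($ErrorExists) {
    Write-Host '5'
    exit 5
}
Write-Host '0'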
Maybe something like this?
# wrap in @() so .Count also works when only one file matches (needed on PS v2)
$foos = @(& {
    Get-ChildItem "C:\path\file1.exe" | Where-Object {$_.LastWriteTime -gt "11/1/2013"} | Select-Object -Last 1
    Get-ChildItem "C:\path\file2.exe" | Where-Object {$_.LastWriteTime -gt "9/10/2013"} | Select-Object -Last 1
    Get-ChildItem "C:\path\file3.exe" | Where-Object {$_.LastWriteTime -gt "4/23/2013"} | Select-Object -Last 1
    Get-ChildItem "C:\path\file4.exe" | Where-Object {$_.LastWriteTime -gt "12/17/2012"} | Select-Object -Last 1
})
if ($foos.Count -eq 4) { Write-Host '0' }
else { Write-Host '5'; exit 5 } # exit 5 so the deployment tool sees the failure code