Using Get-ChildItem to list files with LastWriteTime - powershell

With these lines of code:
Get-ChildItem -Path d:\scripts -Recurse |
Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
ForEach-Object { $_.FullName }
I get a list of everything under the d:\scripts directory whose time stamp is less than 1 day old. Output:
D:\scripts\Data_Files
D:\scripts\Power_Shell
D:\scripts\Data_Files\BackUp_Test.txt
D:\scripts\Power_Shell\archive_test_1dayInterval.ps1
D:\scripts\Power_Shell\stop_outlook.ps1
D:\scripts\Power_Shell\test.ps1
D:\scripts\WinZip\test.wjf
The deal is, the folders (Data_Files & Power_Shell) also have a last write time within the date window. I just want the files, as shown in lines 3 - 7 of the output.
Suggestions?

Get-ChildItem -Path d:\scripts -Recurse |
Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
Where-Object { -not $_.PSIsContainer } |
ForEach-Object { $_.FullName }
$_.PSIsContainer is true for folders, so the extra Where-Object filters them out.
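If you're on PowerShell 3.0 or later (an assumption; the question doesn't say which version), the -File switch does the same filtering without the PSIsContainer check. A minimal sketch:
Get-ChildItem -Path d:\scripts -Recurse -File |
Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
ForEach-Object { $_.FullName }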

gci d:\scripts -Recurse |
? { $_.Attributes -band [System.IO.FileAttributes]::Archive } |
? { $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
foreach { $_.FullName }
or
gci d:\scripts -Recurse |
? { -not ($_.Attributes -band [System.IO.FileAttributes]::Directory) } |
? { $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
foreach { $_.FullName }

Try this:
dir d:\scripts -Recurse | where { !$_.PSIsContainer -and $_.LastWriteTime -gt (Get-Date).AddDays(-1) } | foreach { $_.FullName }

List all files in all subdirectories and sort them by LastWriteTime (newest write at the end):
Get-ChildItem -Recurse | Sort-Object -Property LastWriteTime | Select-Object LastWriteTime,FullName
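To combine this with the files-only, one-day filter from the question above (a sketch; the same d:\scripts path is assumed):
Get-ChildItem -Path d:\scripts -Recurse |
Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -gt (Get-Date).AddDays(-1) } |
Sort-Object -Property LastWriteTime |
Select-Object LastWriteTime, FullName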

Related

List items in a directory over a certain age, then delete them

I'm sure I've missed something obvious, but it's been a while since I have needed to use PowerShell (n.b. it is version 2).
I need a basic script that deletes files over a certain age (3 days). I have the following:
$logDirectory = "C:\logs\"
$days = (Get-Date).AddDays(-3)
# Delete files older than the $days
Get-ChildItem -Path $logDirectory -Recurse |
Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $days } |
%{Write-Host File Found: $_.fullname $_.LastWriteTime}
Get-ChildItem -Path $logDirectory -Recurse |
Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $days } |
Remove-Item -Force
This works, but if I combine the two it doesn't. And I'm sure there must be a neater way to do this, where I can write out a list of the files and then delete them. Something like:
Get-ChildItem -Path $logDirectory -Recurse |
Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $days } |
%{Write-Host File Found: $_.fullname $_.LastWriteTime} |
Remove-Item -Force
But all this does is list the items, not delete them.
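A likely explanation (hedged): Write-Host writes straight to the console and the ForEach-Object block returns nothing, so nothing is passed on to Remove-Item. A minimal sketch of a fix is to emit $_ after logging so the file object continues down the pipeline:
Get-ChildItem -Path $logDirectory -Recurse |
Where-Object { !$_.PSIsContainer -and $_.LastWriteTime -lt $days } |
ForEach-Object { Write-Host "File Found: $($_.FullName) $($_.LastWriteTime)"; $_ } |
Remove-Item -Force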

Delete files older than 30 days but save 1

I need a clean-up script that removes all files older than 30 days, but keeps the most recent file even if it is older than 30 days. Possible? :)
I have tried a couple of parameters but cannot really get it to work. I guess I need an if/else clause?
Would appreciate any guidance and help with this, thanks.
$Daysback = "-30"
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays($Daysback)
$path = "C:\Data\*"
$save1 = Get-ChildItem -Path $path | Where-Object {($_.Name -like "Test*.zip")} | sort LastWriteTime -Descending | select -First
Get-ChildItem $path -Recurse
{($_.CreationTime -le $(Get-Date).AddDays($Daysback))}
{
Remove-Item -Recurse -Force
}
elseif ($save1)
{
Remove-Item -Recurse -Force
}
}
Something like this should work.
$Daysback = "-30"
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.AddDays($Daysback)
$path = "C:\Data\*"
$Items = Get-ChildItem -Path $path -Recurse | Where-Object { ($_.Name -like "Test*.zip") -and ($_.LastWriteTime -le $DatetoDelete) } | Sort-Object LastWriteTime -Descending
$Items | Select-Object -Skip 1 | Remove-Item -Recurse -Force
Get-ChildItem -> filter, so we only get items whose name starts with Test and ends with .zip and that were last written more than 30 days ago, then sort them by LastWriteTime, newest first.
In the delete line, we use -Skip 1 to skip over the first (newest) item in the sorted list; Remove-Item picks up the rest from the pipeline and deletes them.
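As a quick safety check (a sketch using the same $Items variable), the delete line can be previewed with -WhatIf before running it for real:
$Items | Select-Object -Skip 1 | Remove-Item -Recurse -Force -WhatIf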
This can be simplified. The block below grabs all files in C:\Data that match the -Filter (significantly faster than filtering names with Where-Object), further reduces those based on their CreationTime, skips 1, and deletes the rest; drop the -WhatIf once the preview looks right.
Get-ChildItem -Path 'C:\Data' -Filter 'Test*.zip' -Recurse |
Where-Object { -not $_.PSIsContainer -and
$_.CreationTime -le (Get-Date).AddDays(-30) } |
Sort-Object -Property 'LastWriteTime' -Descending |
Select-Object -Skip 1 |
Remove-Item -Force -WhatIf

PowerShell: check a backup directory and delete old files only if there is more than one file

Hello to the whole community. I am trying to inspect the directories and subdirectories of a folder, and if one of them holds more than one file, delete the files that are more than 15 days old and keep only the most recent one.
What I still cannot work out is how to leave a single file untouched, even if it is more than 15 days old, so a file is only deleted when a more recent one exists in the same directory.
I am currently working with this code
$timeLimit = (Get-Date).AddDays(-15)
Get-ChildItem D:\backup\OldFilesTemp -Directory | where LastWriteTime -lt $timeLimit | Remove-Item -Force -Recurse
Grateful for any support you can give me.
You could try something like the following:
$timeLimit = (Get-Date).AddDays(-15)
Get-ChildItem D:\backup\OldFilesTemp | Where-Object { $_.PSIsContainer } | ForEach-Object {
    Get-ChildItem $_.FullName | Where-Object { -not $_.PSIsContainer } |
    Sort-Object -Property LastWriteTime -Descending |
    Select-Object -Skip 1 |
    Where-Object { $_.LastWriteTime -lt $timeLimit } |
    Remove-Item -Force
}
Replace Remove-Item -Force with Remove-Item -WhatIf to perform a dry run.
$timeLimit = ([System.DateTime]::Today).AddDays(-15) # Don't use Get-Date.
$BackupFolder = "D:\backup\OldFilesTemp"
$FolderList = Get-ChildItem $BackupFolder -Directory -Recurse | Select-Object -ExpandProperty FullName
Foreach ($Folder in $FolderList)
{
$FileList = Get-ChildItem $Folder -File | Sort-Object -Property LastWriteTime -Descending
$Count = ($FileList | Where-Object -Property LastWriteTime -GE $timeLimit).Count
#Keep an old file if there is only 1 or no recent backups
if ($Count -le 1)
{
$FileList | Where-Object -Property LastWriteTime -LT $timeLimit | Select-Object -Skip 1 | Remove-Item -Force
}
else
{
$FileList | Where-Object -Property LastWriteTime -LT $timeLimit | Remove-Item -Force
}
}
Better do your own testing before you deploy this in your environment.
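For a dry run (a hedged suggestion), swap Remove-Item -Force for Remove-Item -WhatIf in both branches to see what would be deleted without removing anything, e.g.:
$FileList | Where-Object -Property LastWriteTime -LT $timeLimit | Select-Object -Skip 1 | Remove-Item -WhatIf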

Using Get-ChildItem to get a list of files -like $pattern or newer than any file -like $pattern

As of now I have a script to find files -like '*ver1.0*', and it's working fine.
$files = Get-ChildItem "D:\path\" | where {($_ -like "*$version*.sql")}
List of files:
file_ver1.0_xx.sql
file_ver1.0_xy.sql
But now I need to find files that match as before OR are newer than those matches.
For example, I need to find files matching *ver1.0*, plus any *ver0.9* files whose LastWriteTime is newer than any *ver1.0* file.
For performance reasons I'd enumerate the files just once, determine the most recent modification date of the files with $version, then filter for files with $version and files with a different version but newer date:
$allFiles = Get-ChildItem 'D:\path' -Filter '*.sql' |
Where-Object { -not $_.PSIsContainer }
$refDate = $allFiles | Where-Object {
$_.BaseName -like "*$version*"
} | Sort-Object LastWriteTime | Select-Object -Last 1 -Expand LastWriteTime
$files = $allFiles | Where-Object {
$_.BaseName -like "*$version*" -or
($_.BaseName -notlike "*$version*" -and $_.LastWriteTime -gt $refDate)
}
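A small usage sketch; the 'ver1.0' value is only an assumption based on the sample file names above:
$version = 'ver1.0'
# ...run the block above, then inspect what was picked up:
$files | Select-Object Name, LastWriteTime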
If you need to actually compare version numbers you probably need a somewhat more elaborate approach, though, e.g. like this:
$pattern = 'ver(\d\.\d)'
$refVersion = [version]'1.0'
$allFiles = Get-ChildItem 'D:\path' -Filter '*.sql' |
Where-Object { -not $_.PSIsContainer }
$refDate = $allFiles | Where-Object {
$_.BaseName -match $pattern -and
[version]$matches[1] -eq $refVersion
} | Sort-Object LastWriteTime | Select-Object -Last 1 -Expand LastWriteTime
$files = $allFiles | Where-Object {
$_.BaseName -match $pattern -and
([version]$matches[1] -eq $refVersion -or
([version]$matches[1] -lt $refVersion -and $_.LastWriteTime -gt $refDate))
}
I know that's ugly, but it works.
$files = Get-ChildItem "D:\path" | where {($_ -like "*$version*.sql")}
$files += Get-ChildItem "D:\path" | where {($_ -like "*.sql" -notlike "*$version*.sql" -and $_.LastWriteTime -gt $files[0].LastWriteTime)}

Powershell delete folders

How can I delete folders, not files, using PowerShell? I want to delete folders that are over 3 days old.
Get-ChildItem "D:\test" |
Where-Object { $_.PSIsContainer -and $_.LastWriteTime -le (Get-Date).AddDays(-3) } |
ForEach-Object { Remove-Item $_ -Force }
This doesn't work. I get no error and it does not delete any folders that are within d:\test.
Try:
Get-ChildItem "D:\test" |
Where-Object { $_.PSIsContainer -and $_.LastWriteTime -le (Get-Date).AddDays(-3) } |
Remove-Item -Force
or:
Get-ChildItem "D:\test" |
Where-Object { $_.PSIsContainer -and $_.LastWriteTime -le (Get-Date).AddDays(-3) } |
ForEach-Object { Remove-Item $_.FullName -Force }
Assuming you're using PowerShell 3.0 or later, you don't need to do all the fancy filtering; the -Directory switch gives you only the folders:
Get-ChildItem -Path D:\test -Directory | Where-Object { $_.LastWriteTime -le (Get-Date).AddDays(-3) } | Remove-Item -Recurse -Force
The above will give you what you need.