How to Get-ChildItem for multiple files dated today only - PowerShell

I'm trying to modify a PS script so that it can:
A: Check that multiple files exist and are dated today
B: Do this from multiple locations
$folder = '\\path\subfolder\'
$files = @(
    "file1.txt",
    "file2.txt",
    "file3.txt"
)
Write-Host "Folder: $folder."
# Get only files and only their names
$folderFiles = Get-ChildItem -Path $folder -Recurse -File -Name
foreach ($f in $files) {
if ($folderFiles -contains $f) {
Write-Host "File $f was found." -foregroundcolor green
} else {
Write-Host "File $f was not found!" -foregroundcolor red
}
}
At the moment this script only looks in one folder and does not check that the files are dated today. I have no clue how to change it to use multiple folder locations.

I guess what you are looking for is something like this:
# create an array of paths to search through
$folders = '\\server1\share\path\subfolder\', '\\server2\share\path\subfolder\'
# create an array of file names to look for
$files = 'file1.txt', 'file2.txt', 'file3.txt'
# get the current date as of midnight
$refDate = (Get-Date).Date
# retrieve objects from recursing through the array of folders
Get-ChildItem -Path $folders -Include $files -File -Recurse |
    Where-Object { $_.LastWriteTime -ge $refDate } |
    # output whatever properties you want from the files
    Select-Object DirectoryName, Name, LastWriteTime
The -Include parameter is only effective when the path ends in \* or when the -Recurse switch is used.
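If you would rather not recurse into subfolders, a sketch of the alternative (same placeholder paths and variables as above) is to end each path in \* so -Include still applies:
$folders = '\\server1\share\path\subfolder\*', '\\server2\share\path\subfolder\*'
Get-ChildItem -Path $folders -Include $files -File |
    Where-Object { $_.LastWriteTime -ge $refDate } |
    Select-Object DirectoryName, Name, LastWriteTime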

To get items from today, you could do something similar:
Get-ChildItem $folder * -Recurse | Where-Object { $_.CreationTime -gt (Get-Date).Date }
As for the multiple locations - I would suggest creating an array of folders you want to search and then iterating through the array.
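A minimal sketch of that loop, assuming placeholder folder paths and keeping the CreationTime check from above:
$folders = '\\server1\share\subfolder', '\\server2\share\subfolder'
foreach ($folder in $folders) {
    Get-ChildItem $folder -Recurse -File |
        Where-Object { $_.CreationTime -gt (Get-Date).Date }
}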

Related

Filter and delete files and folders (and files inside of folders) older than x days in PowerShell

This is my first post on this forum. I'm a beginner in coding and I need help with one of my very first self-coded tools.
I made a small script which deletes files if they are older than date x (LastWriteTime). Now to my problem: I want the script to also check files inside of folders within a directory, and only delete a folder afterwards if it is truly empty. I can't figure out how to solve the recursion here; it seems the script just deletes the entire folder based on date x. Could anyone tell me what I missed in this code and help me create my own recursion to solve the problem, or fix the code? Thanks to you all! Here is my code:
I would also be glad if someone knows how to make the code work using a function.
$path = Read-Host "please enter your path"
"
"
$timedel = Read-Host "Enter days in the past (e.g -12)"
$dateedit = (Get-Date).AddDays($timedel)
"
"
Get-ChildItem $path -File -Recurse | foreach{ if ($_.LastWriteTime -and !$_.LastAccessTimeUtc -le $dateedit) {
Write-Output "older as $timedel days: ($_)" } }
"
"
pause
Get-ChildItem -Path $path -Force -Recurse | Where-Object { $_.PsisContainer -and $_.LastWriteTime -le $dateedit } | Remove-Item -Force -Recurse
""
Write-Output "Files deleted"
param(
    [IO.DirectoryInfo]$targetFolder = "d:\tmp",
    [DateTime]$dateTimeX = "2020-11-15 00:00:00"
)
Get-ChildItem $targetFolder -Directory -Recurse | Sort-Object {$_.FullName} -Descending | ForEach-Object {
    Get-ChildItem $_ -File | Where-Object {$_.LastWriteTime -lt $dateTimeX} | Remove-Item -Force -WhatIf
    if ((Get-ChildItem $_).Count -eq 0) {Remove-Item $_ -Force -WhatIf}
}
Remove the -WhatIf switches after testing.
Removing folders that are older than the set number of days, but only if they are empty, leaves you with a problem: as soon as a file is removed from such a folder, the folder's LastWriteTime is updated to that moment in time.
This means you should get a list of older folders first, before you start deleting older files, and use that list afterwards to also remove these folders if they are empty.
Also, a minimal check on the user input from Read-Host should be done (i.e. the path must exist and the number of days must be convertible to an integer). For the latter I chose to simply cast it to [int], because if that fails the code would generate an exception anyway.
Try something like this:
$path = Read-Host "please enter your path"
# test the user input
if (-not (Test-Path -Path $path -PathType Container)) {
    Write-Error "The path $path does not exist!"
}
else {
    $timedel = Read-Host "Enter days in the past (e.g -12)"
    # convert to int and make sure it is a negative value
    $timedel = -[Math]::Abs([int]$timedel)
    $dateedit = (Get-Date).AddDays($timedel).Date  # .Date sets this date to midnight (00:00:00)
    # get a list of all folders (FullNames only) that have a LastWriteTime older than the set date.
    # we check this list later to see if any of the folders are empty and if so, delete them.
    $folders = (Get-ChildItem -Path $path -Directory -Recurse | Where-Object { $_.LastWriteTime -le $dateedit }).FullName
    # get a list of files to remove
    Get-ChildItem -Path $path -File -Recurse | Where-Object { $_.LastWriteTime -le $dateedit } | ForEach-Object {
        Write-Host "older than $timedel days: $($_.FullName)"
        $_ | Remove-Item -Force -WhatIf  # see below about the -WhatIf safety switch
    }
    # now that old files are gone, test the folder list we got earlier and remove any if empty
    $folders | ForEach-Object {
        if ((Get-ChildItem -Path $_ -Force).Count -eq 0) {
            Write-Host "Deleting empty folder: $_"
            $_ | Remove-Item -Force -WhatIf  # see below about the -WhatIf safety switch
        }
    }
    Write-Host "All Done!" -ForegroundColor Green
}
The -WhatIf switch used on Remove-Item is there for your own safety. With it, no file or folder is actually deleted; instead, the console shows what would be deleted. Once you are satisfied that this is all good, remove the -WhatIf and run the code again to really delete the files and folders.
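For example, with -WhatIf in place you would see something like this in the console instead of an actual delete (the file path here is hypothetical):
Remove-Item -Path 'C:\temp\oldfile.log' -Force -WhatIf
# What if: Performing the operation "Remove File" on target "C:\temp\oldfile.log".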
Try something like this:
$timedel=-12
#remove old files
Get-ChildItem "C:\temp" -Recurse -File | Where LastWriteTime -lt (Get-Date).AddDays($timedel) | Remove-Item -Force
#remove directory without file
Get-ChildItem "C:\temp\" -Recurse -Directory | where {(Get-ChildItem $_.FullName -Recurse -File).count -eq 0} | Remove-Item -Force -recurse

PowerShell .AddDays()

I'm writing a script to check if files are older than a year. I get an error saying the object is not IComparable. I'm not sure how to go about fixing this and am stumped.
$file
$myDate = Get-Date
$path = $args[0]
$files = Get-ChildItem -Path $path -recurse
foreach ($file in $files) {
    if ($file -gt $myDate.AddDays(-365)) {
        Write-Host "Found One"
    }
}
You need to get the files before looping over them. You also need to tell PowerShell that you want to compare a date of the file, and which date - created, modified, etc. At the moment you're comparing the FileInfo object itself to a date, which is why you're getting that error (as per mklement0's comment, FileInfo does not implement IComparable).
$files = Get-ChildItem -Path $args[0]
foreach ($file in $files) {
    if ($file.LastWriteTime -lt $myDate.AddDays(-365)) {
        Write-Host "Found One: $($file.Name)"
    }
}
Using args[0] is bad practice. Use a named parameter instead:
Param (
$Path
)
$myDate = Get-Date
$files = Get-ChildItem -Path $Path
...
Documentation:
Get-ChildItem - change the parameters if you want to, e.g., include subdirectories.
FileInfo - this is what you can access in $files.
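If you want to check which date properties a FileInfo object exposes before deciding what to compare on, a quick sketch:
Get-ChildItem -Path $Path -File | Get-Member -MemberType Property -Name *Time*
# lists CreationTime, LastAccessTime and LastWriteTime (plus their Utc variants)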
One-liner, if you want a short way to get files older than a year:
gci 'd:\temp' | ? { $_.CreationTime -lt (Get-Date).addDays(-365) }

Why is the exclude clause not working as expected in PowerShell?

I have a PowerShell script designed to go through some backup files, archive the month-end backup (if it's the first of the month) and delete anything older than 9 days after that. I'm trying to tell it to exclude the archive folder, but it seems to be ignoring that and I'm unsure why. Kind of a noob to PowerShell.
I've tried using -notlike $Exclude+"%", -notmatch and -ne. I even tried -notcontains even though I knew that wouldn't work. I tried applying $Exclude to every portion of the code where that folder might be accessed and it's still making a copy of it.
#If first of the month copy latest backups to Monthly Backup Folder - will copy full folder path
PARAM($BackupPath="\\SomeServer\sqltestbackups\",
$Exclude= "\\SomeServer\sqltestbackups\Archive")
#Grab a recursive list of all subfolders
$SubFolders = dir $BackupPath -Recurse | Where-Object {$_.PSIsContainer} | ForEach-Object -Process {$_.FullName}
#Iterate through the list of subfolders and grab the first file in each
$Year = (get-date).Year
$Month = (get-date).Month
$StartOfMonth = Get-Date -Year $Year -Month $Month -Day 1
$Today = (Get-Date).day
$Date = (GET-DATE).AddDays(-9)
$PrevMonth = (GET-DATE).year.ToString()+(Get-Culture).DateTimeFormat.GetMonthName((Get-Date).AddMonths(-1).month)
$Destination=$Exclude + "\" + $Prevmonth +"\"
IF (!(Test-Path -path $destination)) {New-Item $destination -Type Directory}
IF($Today -eq '5')
{
$Path = $BackupPath #Root path to look for files
$DestinationPath = $Destination #Remote destination for file copy
#Grab a recursive list of all subfolders
$SubFolders = dir $Path -Recurse | Where-Object {$_.PSIsContainer -and $_.DirectoryName -notmatch $Exclude} | ForEach-Object -Process {$_.FullName}
#Iterate through the list of subfolders and grab the first file in each
ForEach ($Folder in $SubFolders)
{
$FullFileName = dir $Folder | Where-Object {!$_.PSIsContainer -and $_.DirectoryName -notmatch $Exclude} | Sort-Object {$_.LastWriteTime} -Descending | Select-Object -First 1
#For every file grab it's location and output the robocopy command ready for use
ForEach ($File in $FullFileName)
{
$FilePath = $File.DirectoryName
$ArchFolder = $File.DirectoryName
$ArchFolder=$ArchFolder.Replace($BackupPath,"")+"\"
$FinalPath=$destinationPath+$ArchFolder
$FileName = $File.Name
robocopy $FilePath $FinalPath $FileName /A-:SH /R:6 /W:30 /Z
}
}
}
# Delete files older than 9 days that are not contained within the month end folder
Get-ChildItem $BackupPath -Recurse |
Where-Object { !($_.PSIsContainer) -and
$_.LastWriteTime -lt $Date -and
$_.Directory -notlike $Exclude+"%" } |
Remove-Item
The code works except for the copy-month-end portion. In this portion it is including the archive folder, and I end up with it copying the previous month. The script is designed to put the files in Archive/YM/FullPathOfBackup, but what is happening is it goes Archive/YM/Archive/YM/FullPath even though I'm trying SO HARD to exclude this path from the folders.
Example of what's going wrong with the robocopy:
Source : \\SomeServer\sqltestbackups\ARCHIVE\2019March\SomeOtherServer\SQLBackups\SomeDatabase\master\
Dest : \\SomeServer\sqltestbackups\Archive\2019March\ARCHIVE\2019March\SomeOtherServer\SQLBackups\SomeDatabase\master\
The type DirectoryInfo does not have a property DirectoryName. Try the property BaseName instead.
# dir, gci are aliases for Get-ChildItem
Get-ChildItem $Path -Recurse `
| Where-Object { $_.PSIsContainer -and $_.BaseName -notmatch $Exclude }
works.
To see which types are returned and which members they have, type:
Get-ChildItem .\ | Where-Object { $_.PSIsContainer } | ForEach-Object { $_.GetType() }
# and
Get-ChildItem .\ | Where-Object { $_.PSIsContainer } | Get-Member
But be careful: You will get a lot of output on directories with many files and subdirectories.
Also have a look at the answer "How can I exclude multiple folders using Get-ChildItem -exclude?" on Stack Overflow. That answer comes with lots of elegant solutions for PowerShell from v1.0 to v5.0.
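One common pattern from that thread (a sketch; the folder names here are placeholders) is to keep an array of folder names to skip and filter with -notcontains:
$excludeDirs = 'Archive', 'OldBackups'
Get-ChildItem $BackupPath -Recurse |
    Where-Object { $_.PSIsContainer -and $excludeDirs -notcontains $_.Name }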
The problem was
$Exclude = "\\SomeServer\sqltestbackups\Archive" - I needed to double up the backslashes to be
$Exclude = "\\\\SomeServer\\sqltestbackups\\Archive"
After doing this the script worked fine.
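That works because -notmatch treats $Exclude as a regular expression, and the backslash is the regex escape character. If you prefer not to double the backslashes by hand, a sketch using [regex]::Escape would be:
$Exclude = [regex]::Escape('\\SomeServer\sqltestbackups\Archive')
# $_.FullName -notmatch $Exclude now treats the path as a literal string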

Issue with archiving files

I am trying to archive old files on the server (older than 90 days), and there should be a separate ZIP file for every month. I have PowerShell v1.0, so I am not able to use System.Management.Automation.PSObject.
I have created a script, but I have a problem with the zip file name. When I run the script, all files are moved to one archive with the name +++.
$folders ="C:\New folder\test\"
Function Zip
{
Param
(
[string]$zipFile
,
[string[]]$toBeZipped
)
$CurDir = Get-Location
Set-Location "C:\program files\7-zip\"
.\7z.exe A -tzip $zipFile $toBeZipped | Out-Null
Set-Location $CurDir
}
$files = Get-ChildItem -Path $folders | Where-Object {$_.LastwriteTime -lt ((Get-date).adddays(-90))} | % { $_.FullName};
if ( !$files)
{break}
else
{
Write-Host $files
$file_year = $files.LastwriteTime.Year
$file_month = $files.LastwriteTime.Month
echo $file_month
ZIP $folders+"$file_year"+"$file_month"+".zip" $files
If(Test-Path $folders+$file_year+$file_month+.zip)
{
Remove-Item $files
}}
It would be nice if someone could figure out what I am doing wrong.
There are two issues why this doesn't work. First, you are selecting (using the ForEach-Object alias %) only the FullName property, so you can no longer access the LastWriteTime property. Secondly, you are trying to access the property on a potential array of files (which file's year and month do you want to use?).
So I would change/refactor your script to something like this (untested):
$folders ="C:\New folder\test\"
function Write-ZipFile
{
Param
(
[string]$Path,
[string[]]$Files
)
$7zip = Join-Path $env:ProgramFiles '\7-zip\7z.exe'
& $7zip A -tzip $zipFile $toBeZipped | Out-Null
}
$files = Get-ChildItem -Path $folders |
Where-Object { $_.LastWriteTime -lt ((Get-date).AddDays(-90)) }
$zipFile = Join-Path $folders ('{0}{1}.zip' -f $files[0].LastwriteTime.Year, $files[0].LastwriteTime.Month)
Write-ZipFile -Path $zipFile -Files ($files | select -ExpandProperty FullName)
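If you really need a separate archive per month, as the question describes, one possible sketch (untested, same placeholder path) is to group the files by year and month first and call Write-ZipFile once per group:
$files = Get-ChildItem -Path $folders |
    Where-Object { $_.LastWriteTime -lt ((Get-Date).AddDays(-90)) }
$files | Group-Object { '{0}{1}' -f $_.LastWriteTime.Year, $_.LastWriteTime.Month } | ForEach-Object {
    $zipFile = Join-Path $folders ('{0}.zip' -f $_.Name)
    Write-ZipFile -Path $zipFile -Files ($_.Group | Select-Object -ExpandProperty FullName)
}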
In your original code, the list of file names to archive is in $files. Use $files.GetType() to see that it is an array. Use $files[0].GetType() to see that each element is a string type, not a file object type.
$files = Get-ChildItem -Path $folders |
Where-Object {$_.LastwriteTime -lt (Get-date).adddays(-90)}
I imagine that you will want to omit directories.
The $files array will be an array of FileInfo objects, not strings.
Secondly, do something to iterate over the list of FileInfo objects.
[cmdletbinding()]
Param()
$folders = "H:\src\powershell\afm"
Function Zip
{
Param (
[string]$zipFile
, [string[]]$toBeZipped
)
& "C:\Program Files\7-Zip\7z.exe" A -tzip $zipFile $toBeZipped | Out-Null
}
$files = Get-ChildItem -Path $folders -File |
Where-Object {($_.LastwriteTime -lt (Get-date).adddays(-10)) -and ($_.Extension -ne ".zip")}
$files | ForEach-Object {
Write-Verbose "Working on file $_"
$file_year = $_.LastwriteTime.Year
$file_month = $_.LastwriteTime.Month
Write-Verbose "file_month is $file_month"
Zip "$folders\$file_year$file_month.zip" "$folders\$_"
If (Test-Path "$folders\$file_year$file_month.zip")
{
### Remove-Item $_
}
}
It would appear that the problem is that there is nothing processing each file in order to archive it; there is no ForEach-Object.
There is no LastWriteTime property on an array object, only on a FileInfo object.

Need a PowerShell script that will move folders and files from one location to another

I need a PowerShell script that will move folders and files that are older than x number of days from one location to another, but with some folders exempted.
It also needs the ability to email a list of the files and folders that it moved.
I can move the files in a folder; I'm not sure how to move entire folders though.
Here is some code I have put together so far; any suggestions would be great:
Set-ExecutionPolicy RemoteSigned
#----- define parameters -----#
#----- get current date ----#
$Now = Get-Date
#----- define amount of days ----#
$Days = "7"
#----- define folder where files are located ----#
$TargetFolder = "C:\test"
$TargetPath = "C:\test5"
#----- define extension ----#
$Extension = "*.*"
#----- define LastWriteTime parameter based on $Days ---#
$LastWrite = $Now.AddDays(-$Days)
#----Exclusion List ----#
$exclude = @('test1', 'test2')
#----- get files based on lastwrite filter and specified folder ---#
$Files = Get-Childitem -path $TargetFolder -Include $Extension -Recurse | Where {$_.LastWriteTime -le "$LastWrite"} -and $_Name -ne $exclude | foreach ($_)} #-
foreach ($File in $Files)
{
if ($File -ne $NULL)
{
write-host "Deleting File $File" -ForegroundColor "DarkRed"
Move-Item $File.FullName $TargetPath -force
}
else
{
Write-Host "No more files to delete!" -foregroundcolor "Green"
}
}
A shorthand version that is supported on PowerShell v3 or higher. This would find all the folders where the LastWriteTime is older than 7 days and move them:
$LastWrite = (Get-Date).AddDays(-7)
gci c:\temp -Directory -Recurse | ?{$_.LastWriteTime -le $LastWrite} | select -expand fullname | %{Move-Item $_ $TargetPath}
There would be no point in file exclusions if you are just looking at the folder time, so that logic is omitted. Same code, but easier to read:
$LastWrite = (Get-Date).AddDays(-7)
Get-ChildItem $TargetFolder -Directory -Recurse | Where-Object { $_.LastWriteTime -le $LastWrite } | Select-Object -ExpandProperty FullName | ForEach-Object {
Move-Item $_ $TargetPath
}
Caveat
There could be an issue where you are trying to move a folder whose parent has already been moved. I don't really have the test environment to check that right now. You could just use a little test before the move, just in case:
If(Test-Path $_){Move-Item $_ $TargetPath}
Email
A starting point for working with email would be Send-MailMessage. There are other methods as well.
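A minimal sketch of that (the SMTP server and addresses are placeholders, and $movedItems would be a list you collect while moving):
$movedItems = @()   # append each moved path to this list inside the move loop
Send-MailMessage -SmtpServer 'smtp.example.com' `
    -From 'backups@example.com' -To 'admin@example.com' `
    -Subject "Items moved on $(Get-Date -Format yyyy-MM-dd)" `
    -Body ($movedItems -join "`r`n")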
Folder Exclusion
If you wanted to omit certain folders, there are a couple of ways to accomplish that. If you know the whole folder name you want to omit, you could add $exclude = @('test1', 'test2') like you already have and change the Where clause:
Where-Object{$_.LastWriteTime -le $LastWrite -and $exclude -notcontains $_.Name}
If you didn't know the whole name, and $exclude only contained partial names, you could do this as well using a little regex:
$exclude = @('test1', 'test2')
$exclude = "({0})" -f ($exclude -join "|")
#..... other stuff happens
Where-Object{$_.LastWriteTime -le $LastWrite -and $_.Name -notmatch $exclude}