I'm writing a script that I can run once every two weeks to clear out folders and files that haven't been accessed in two weeks or longer. I've written most of the script, and it works well until I add the following line of code:
Where-Object { $_.LastAccessTime -lt $RefDate } |
For some reason, this line prevents $condition from producing any output [see code below].
I use $condition later in a do-while loop to recursively delete folders, but because $condition is empty the loop never runs, and the data is no longer exported to a CSV file. [Removing this line lets it work again]
Here are the key sections the line appears in:
$dPath = "C:\Users\my.name\Desktop\PowTest2\*\"
$RefDate = (Get-Date).AddHours(-1);
$condition = Get-ChildItem $dPath -Recurse -Force |
Where-Object { $_.LastAccessTime -lt $RefDate } |
Where-Object { $_.PSIsContainer -eq $True } |
Where-Object { $_.GetFileSystemInfos().Count -eq 0 } |
Select-Object FullName
write-output $RefDate;
write-output $condition
Above, $RefDate outputs as expected, but $condition outputs nothing unless I remove the problematic line of code.
Edit:
Hi all,
Olaf made a good point, and asked me to check if the property is tracked for the folder. It appears it isn't, which would explain my issue. I'll update after more research and testing.
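One way to confirm Olaf's point is to separate the two suspects: run the same filters with a reference date in the future, so every LastAccessTime passes regardless of whether the property is being updated. If $condition fills up then, the pipeline composes fine and stale LastAccessTime values are the culprit (on Windows, `fsutil behavior query disablelastaccess` shows whether last-access updates are turned off). A minimal sketch against a throwaway temp folder (folder names here are made up for the demo):

```powershell
# Sketch: sanity-check the pipeline with a future reference date.
# A temp folder with one empty subfolder, so the filters must match it.
$root = Join-Path ([IO.Path]::GetTempPath()) 'PowTest2Demo'
New-Item -ItemType Directory -Path (Join-Path $root 'empty') -Force | Out-Null

$RefDate = (Get-Date).AddHours(1)   # future: every LastAccessTime passes
$condition = Get-ChildItem $root -Recurse -Force |
    Where-Object { $_.LastAccessTime -lt $RefDate } |
    Where-Object { $_.PSIsContainer } |
    Where-Object { $_.GetFileSystemInfos().Count -eq 0 } |
    Select-Object FullName

$condition   # non-empty here; if it stays empty with a past $RefDate, blame the timestamps
```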
Not sure if it's been answered already, but I was doing some research and found a 7-month-old post where it was answered. I changed it around to match what you'd like ("It deletes all files older than 13 days, so not sure if that works lol"). Let me know if this helps.
OG POST: Powershell delete folder files based on date range criteria
$path = "C:\Users\XXXXXXXX\Desktop\test"
foreach ($file in (Get-ChildItem -Path $path -Recurse | Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-14) }))
{
    if (($file.LastWriteTime.Date.Day -in 1,15,30) -or ($file -like '*weekly*'))
    { continue }
    Remove-Item -Path $file.FullName
}
Related
I've been running around like crazy lately with this script, trying to modify it to suit my needs. I recently found out that deleting the files using "LastWriteTime" is not what I'm after.
What I need my script to do is delete the files that are older than 30 days using the "CreationTime" property. The problem is that after I modify the script to use it, it deletes the entire folder structure.
How can this small modification change the behavior of the entire script?
This is what I'm using:
$limit = (Get-Date).AddDays(-30)
$del30 = "D:\CompanyX_ftp\users"
$ignore = Get-Content "C:\Users\UserX\Documents\Scripts\ignorelist.txt"
Get-ChildItem $del30 -Recurse |
Where-Object {$_.CreationTime -lt $limit } |
Select-Object -ExpandProperty FullName |
Select-String -SimpleMatch -Pattern $ignore -NotMatch |
Select-Object -ExpandProperty Line |
Remove-Item -Recurse
So if I replace the "CreationTime" property with "LastWriteTime", the script runs and does what it's supposed to, but if I use "CreationTime" it just deletes everything under the folder structure, including the folders themselves and the paths it's supposed to ignore.
UPDATE: The script is now deleting the files correctly, but the variant I use just to report which files would be deleted is still including the paths from the ignorelist.txt file.
Please see below script:
$limit = (Get-Date).AddDays(-30)
$del30 = "D:\CompanyX_ftp\users"
#Specify path for ignore-list
$ignore = Get-Content "C:\Users\UserX\Documents\Scripts\ignorelist.txt"
Get-ChildItem $del30 -File -Recurse |
Where-Object {$_.CreationTime -lt $limit } |
Select-Object -ExpandProperty FullName |
Select-String -SimpleMatch -Pattern $ignore -NotMatch |
Select-Object -ExpandProperty Line |
Get-ChildItem -Recurse | Select-Object FullName,CreationTime
ignorelist.txt sample data:
D:\CompanyX_ftp\users\ftp-customerA\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerB\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerC\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerD\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerE\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerF\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerG\Customer Downloads
D:\CompanyX_ftp\users\ftp-customerH\Customer Downloads\
Any ideas on why it's including the paths that I mentioned in ignorelist.txt? (I will also provide an image for better illustration.)
Thanks in advance for any help or guidance with this.
//Lennart
I see two problems with the updated code:
Duplicate recursion. First Get-ChildItem iterates over contents of directory recursively. Later in the pipeline another recursive iteration starts on items returned by the first Get-ChildItem, causing overlap.
When filtering by $ignore, only paths that exactly match against the $ignore paths are being ignored. Paths that are children of items in the ignore list are not ignored.
Here is how I would do this. Create a function Test-IgnoreFile that matches given path against an ignore list, checking if the current path starts with any path in the ignore list. This way child paths are ignored too. This enables us to greatly simplify the pipeline.
Param(
[switch] $ReportOnly
)
# Returns $true if $File.Fullname starts with any path in $Ignore (case-insensitive)
Function Test-IgnoreFile( $File, $Ignore ) {
foreach( $i in $Ignore ) {
if( $File.FullName.StartsWith( $i, [StringComparison]::OrdinalIgnoreCase ) ) {
return $true
}
}
$false
}
$limit = (Get-Date).AddDays(-30)
$del30 = "D:\CompanyX_ftp\users"
$ignore = Get-Content "C:\Users\UserX\Documents\Scripts\ignorelist.txt"
Get-ChildItem $del30 -File -Recurse |
Where-Object { $_.CreationTime -lt $limit -and -not ( Test-IgnoreFile $_ $ignore ) } |
ForEach-Object {
if( $ReportOnly) {
$_ | Select-Object FullName, CreationTime
}
else {
$_ | Remove-Item -Force
}
}
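To see the prefix match in isolation, here is the same helper exercised against a couple of hypothetical paths (the [pscustomobject] stands in for a FileInfo object, since only .FullName is consulted):

```powershell
# Prefix-matching helper from the answer above, demoed standalone.
Function Test-IgnoreFile( $File, $Ignore ) {
    foreach( $i in $Ignore ) {
        if( $File.FullName.StartsWith( $i, [StringComparison]::OrdinalIgnoreCase ) ) {
            return $true
        }
    }
    $false
}

$ignore = @('D:\CompanyX_ftp\users\ftp-customerA\Customer Downloads')

# A child of an ignored folder is ignored too...
Test-IgnoreFile ([pscustomobject]@{ FullName = 'D:\CompanyX_ftp\users\ftp-customerA\Customer Downloads\a.txt' }) $ignore   # True
# ...while a sibling folder is not.
Test-IgnoreFile ([pscustomobject]@{ FullName = 'D:\CompanyX_ftp\users\ftp-customerA\Uploads\a.txt' }) $ignore              # False
```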
So, I have a directory of pictures for a website. There are pictures that are the same with both a .jpeg extension and a .webp extension. I want to write a PowerShell script that finds all the existing .jpeg files that changed in the last 24 hours, and then find the respective .webp file and delete the .webp file.
I've tried this to get all the .webp files that can be deleted but it doesn't seem to work:
$images = Get-ChildItem -Path $dir\*.jpg, $dir\*.webp |
Group-Object { $_.BaseName } |
Where-Object {($_.CreationTime -gt (Get-Date).AddDays(-1)) -or ($_.Group.Extension -notcontains '.jpg')} |
ForEach-Object Group
I think this is easier:
Get a list of base names for the .jpg files that were last modified since yesterday; next, get a list of files in the same directory with the .webp extension that have a BaseName matching one of those .jpg base names, and then remove these.
$dir = 'D:\Test'
$refdate = (Get-Date).AddDays(-1).Date
$jpegs = (Get-ChildItem -Path $dir -Filter '*.jpg' -File | Where-Object { $_.LastWriteTime -gt $refdate }).BaseName
Get-ChildItem -Path $dir -Filter '*.webp' -File | Where-Object { $jpegs -contains $_.BaseName } | Remove-Item -WhatIf
Once you are satisfied the correct files are getting deleted, remove the safety -WhatIf switch and run again.
Theo's helpful answer shows you an alternative approach; if you want to stick with the Group-Object approach:
$webpFilesToDelete =
Get-ChildItem -Path $dir\*.jpg, $dir\*.webp |
Group-Object BaseName | Where-Object Count -eq 2 |
ForEach-Object {
# Test the creation date of the *.jpg file.
if ($_.Group[0].CreationTime -gt (Get-Date).AddDays(-1)) {
$_.Group[1] # Output the corresponding *.webp file
}
}
Note that, as in your own attempt, .CreationTime is used; if there's a chance that the files are updated again after creation and that is the timestamp you care about, use .LastWriteTime instead.
The command relies on the fact that Get-ChildItem processes the two wildcard paths in order, emitting all the .jpg files before the .webp files, and that Group-Object preserves this input order within each group it creates.
Therefore, for groups that have 2 elements (Where-Object Count -eq 2), implying that both a .jpg and a .webp file exist for the given base name, $_.Group[0] refers to the .jpg file, and $_.Group[1] to the .webp file.
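This ordering can be verified in isolation with throwaway files (the folder and names below are hypothetical, created just for the demo):

```powershell
# Demo: within each group, the .jpg precedes the .webp because Get-ChildItem
# enumerates the first wildcard path (*.jpg) completely before the second.
$dir = Join-Path ([IO.Path]::GetTempPath()) 'group-demo'
New-Item -ItemType Directory -Path $dir -Force | Out-Null
'a.jpg', 'a.webp' | ForEach-Object {
    New-Item -ItemType File -Path (Join-Path $dir $_) -Force | Out-Null
}

$pair = Get-ChildItem -Path (Join-Path $dir '*.jpg'), (Join-Path $dir '*.webp') |
    Group-Object BaseName | Where-Object Count -eq 2

$pair.Group[0].Extension   # .jpg
$pair.Group[1].Extension   # .webp
```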
As for what you tried:
$_.CreationTime yields $null, because $_ in your command refers to the group-information object at hand (an instance of Microsoft.PowerShell.Commands.GroupInfo), as output by Group-Object, and this type has no such property.
Also, since you're using Where-Object, you're simply filtering groups, so that any group that passes the filter tests is passed through as-is, and ForEach-Object Group then outputs both files in the group.
Thanks for your helpful answers and the clarification! I thought initially that I needed .LastWriteTime, but I needed .CreationTime. This serves my needs. Here is the result:
$refdate = (Get-Date).AddDays(-1).Date
$jpegs = (Get-ChildItem -Path $dir -Filter '*.jpg' -File | Where-Object { $_.CreationTime -gt $refdate }).BaseName
$webp = Get-ChildItem -Path $dir -Filter '*.webp' -File | Where-Object { $jpegs -contains $_.BaseName }
Current situation
I've been working on a script that should copy .txt files that contain specific words and are at most 7 days old into a folder, once. So far I've only been able to get code that copies files to the destination folder if they don't already exist there.
Code
$path = "c:\PS1\*.txt"
$Destination = "c:\PS2\"
$filter = "thisisatest"
$logfile = "C:\PS2\testlog_$(get-date -format `"MMyyyy`").txt"
#Picking files with certain words and modified within the last 7 days
Foreach($file in (Get-ChildItem $path | Where-Object {$_.name -Match $filter}))
{
If($file.LastWriteTime -gt (Get-Date).adddays(-7).date)
#Logging
{
function log($string, $color)
{
if ($Color -eq $null) {$color = "white"}
write-host $string -foregroundcolor $color
$string | out-file -Filepath $logfile -append
}
#Comparing folder content and copying those not present in destination folder
Compare-Object $path $Destination -Property Name, Length | Where-Object {$_.SideIndicator -eq "=>"} | ForEach-Object {Copy-Item -Path $file.fullname -Destination $Destination -Force}
}
log $file.fullname
}
In conclusion
I have tried finding code which would make it possible to do the following:
Compare the path folder's content to the .txt log for recurring names and only copy those not present in the list; if a file name is present in the list, move on to the next file to copy.
Log only files that have just been copied in a .txt file; if the log doesn't exist, create it.
Delete log content older than 30 days.
Some of my code is probably obsolete or lacking parts; it is made up from bits and pieces I found while looking for examples.
I know most of it is probably doable with Robocopy, but I hope it can be done in PowerShell.
Hope you can help me.
Ok, I moved the parameters inside the function, made the $string parameter mandatory, gave the option of specifying a log file (it defaults to the global variable $logfile), and gave the color parameter validation so people can IntelliSense it or tab-complete it. I also moved the function to the beginning of the script, since that's where you usually find functions and it just made more sense to me. I also made sure that the log file isn't a folder, and that it has an extension (it adds .log if it doesn't and isn't an existing file that you're just appending to). I think I may have overdone the function: it's very functional, and versatile in case it's needed for another script, but it's kind of overkill.
Then I looked at how files were being selected, and then filtered, and revamped things a bit. Now the Get-ChildItem (alias GCI) filters the file name against $Filter, the last write time against Get-Date -7 days, it makes sure the FullName is not in the log file, and it makes sure that it's a file, not a folder.
I also added a line to clean up old logs.
function log{
Param(
[string]$string=$(Throw 'You must provide a string to enter into the log'),
$LogFile = $Global:LogFile,
[ConsoleColor]
$color = "white")
If((gci $logfile -ea SilentlyContinue).psiscontainer){"Specified log file is a folder, please specify a file name.";break}
If($logfile -notmatch "\.[^\\]*$" -and !(test-path $LogFile)){$LogFile = $LogFile + ".log"}
write-host $string -foregroundcolor $color
$string | out-file -Filepath $logfile -append
}
$path = "c:\PS1\*.txt"
$Destination = "c:\PS2\"
$filter = "thisisatest"
$logfile = $Destination+"testlog_$(get-date -format `"MMyyyy`").txt"
gci $Destination -filter "testlog_*.txt" |?{$_.LastWriteTime -le (get-date).AddDays(-42)}| Remove-Item -Force
$AllLogs = gc "$($Destination)testlog_*.txt"
#Picking files with certain words, was modified within the last 7 days, and not already in a logfile
Foreach($file in (Get-ChildItem $path | Where-Object {!$_.PSIsContainer -and $_.LastWriteTime -gt (Get-Date).adddays(-7) -and $AllLogs -notcontains $_.FullName -and $_.Name -match $filter}))
{
$File | ?{ !(Test-Path (Join-Path $Destination $File.Name)) } | Copy-Item -Destination $Destination
log $file.fullname
}
I'm writing a PowerShell script that deletes all but the X most recent folders, excluding a folder named Data. My statement to gather the folders to delete looks like this:
$folders1 = Get-ChildItem $parentFolderName |
? { $_.PSIsContainer -and $_.Name -ne "Data" } |
sort CreationTime -desc |
select -Skip $numberOfFoldersToKeep
foreach ($objItem in $folders1) {
Write-Host $webServerLocation\$objItem
Remove-Item -Recurse -Force $parentFolderName\$objItem -WhatIf
}
This works great when I pass a $numberOfFoldersToKeep that is fewer than the number of folders in the starting directory $parentFolderName. For example, with 5 subdirectories in my target folder, this works as expected:
myScript.ps1 C:\StartingFolder 3
But if I were to pass a high number of folders to skip, my statement seems to return the value of $parentFolderName itself! So this won't work:
myScript.ps1 C:\StartingFolder 15
Because the skip variable exceeds the number of items in the Get-ChildItem collection, the script tries to delete C:\StartingFolder\, which was not what I expected at all.
What am I doing wrong?
try this:
$folders1 = Get-ChildItem $parentFolderName |
? { $_.PSIsContainer -and $_.Name -ne "Data" } |
sort CreationTime -desc |
select -Skip $numberOfFoldersToKeep
if ($folders1 -ne $null)
{
foreach ($objItem in $folders1) {
Write-Host $($objItem.fullname)
Remove-Item -Recurse -Force $objItem.fullname -WhatIf
}
}
I gave @C.B. credit for the answer, but there's another way to solve the problem: forcing the output of Get-ChildItem to an array using the @( ... ) syntax.
$folders1 = @(Get-ChildItem $parentFolderName |
? { $_.PSIsContainer -and $_.Name -ne "Data" } |
sort CreationTime -desc |
select -Skip $numberOfFoldersToKeep)
foreach ($objItem in $folders1) {
Write-Host $webServerLocation\$objItem
Remove-Item -Recurse -Force $parentFolderName\$objItem -WhatIf
}
This returns an array of length zero, so the body of the foreach statement is not executed.
As C.B. noted in the comments above, the problem is that if you pass a null collection into a foreach statement in PowerShell, the body of the foreach statement is executed once.
This was completely unintuitive to me, coming from a .NET background. Apparently it's unintuitive to lots of other folks as well, since there are bug reports filed for this behavior on Microsoft Connect: https://connect.microsoft.com/feedback/ViewFeedback.aspx?FeedbackID=281908&SiteID=99
Apparently, this bug has been fixed in PowerShell V3.
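On PowerShell 3+ the run-once behavior itself no longer reproduces, but the @( ... ) wrapper is still the version-independent way to guarantee an empty result is a zero-length array rather than $null. A small sketch:

```powershell
# An empty pipeline result wrapped in @( ... ) is a zero-length array,
# so the foreach body runs zero times on every PowerShell version.
$folders = @(1..5 | Where-Object { $_ -gt 100 })

$iterations = 0
foreach ($f in $folders) { $iterations++ }

$folders.Count   # 0
$iterations      # 0
```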
I am working in a windows environment.
I have a project that requires a short script to determine if a file with a modified date of today exists in a folder. If such a file exists, the script should copy it; if not, it should return an error code.
I prefer not to use 3rd-party apps, so I am considering PowerShell.
I can pull a list to visually determine if the file exists, but I am having trouble making the batch job return an error if the count is zero.
Get-ChildItem -Path C:\temp\ftp\archive -Recurse | Where-Object { $_.lastwritetime.month -eq 3 -AND $_.lastwritetime.year -eq 2013 -AND $_.lastwritetime.day -eq 21}
Any help is greatly appreciated!
You can compare the current date against the date part of each file's LastWriteTime, using the short date string:
Get-ChildItem -Path C:\temp\ftp\archive -Recurse | Where-Object {
$_.LastWriteTime.ToShortDateString() -eq (Get-Date).ToShortDateString()
}
Get-ChildItem $path -r | % { if (!($_.PSIsContainer) -and (Get-Date $_.LastWriteTime -UFormat %D) -eq (Get-Date -UFormat %D)) { $_.FullName } else { Write-Warning 'Not from Today' } }
F.Y.I. when doing large jobs, like if you'll be going through TBs of files, use ForEach-Object. It's faster than Where-Object: it processes each object as it becomes available in the pipeline and doesn't wait until all objects are collected.
In summary, there are always a lot of different ways to achieve the same result in PowerShell. I advocate using what is easiest for you to remember. At the same time, PowerShell can provide some big performance differences between the approaches, and it pays to know more!
You can still make the line a little more efficient by calculating the date once:
$date = (Get-Date -UFormat %D)
Get-ChildItem $path -r | % { if (!($_.PSIsContainer) -and (Get-Date $_.LastWriteTime -UFormat %D) -eq $date) { $_.FullName } else { Write-Warning 'Not from Today' } }
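A sketch of that ForEach-Object style with the filter and projection in one streaming scriptblock; the folder here is a throwaway temp directory created for the demo, not a path from the question:

```powershell
# Filter and emit inside a single ForEach-Object: files written today are
# output as they stream through, nothing is buffered first.
$path = Join-Path ([IO.Path]::GetTempPath()) 'stream-demo'
New-Item -ItemType Directory -Path $path -Force | Out-Null
New-Item -ItemType File -Path (Join-Path $path 'fresh.txt') -Force | Out-Null

$today = (Get-Date).Date
$hits = Get-ChildItem $path -Recurse -File | ForEach-Object {
    if ($_.LastWriteTime.Date -eq $today) { $_.FullName }
}
$hits   # full path(s) of files written today
```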
I was able to use the following script:
$Date = Get-Date
$Date = $Date.AddDays(-1)
$Date2Str = $Date.ToString("yyyyMMdd")
$Files = gci "C:\Temp\FTP\Archive"
ForEach ($File in $Files) {
    $FileDate = $File.LastWriteTime
    $CTDate2Str = $FileDate.ToString("yyyyMMdd")
    if ($CTDate2Str -eq $Date2Str) { Copy-Item $File.FullName "C:\Temp\FTP"; exit }
}
Throw "No file was found to process"
To test if there are no files:
$out = @(Get-ChildItem -Path C:\temp\ftp\archive -Recurse | Where-Object {
    $_.LastWriteTime.ToShortDateString() -eq (Get-Date).ToShortDateString()
})
if ($out.Count -gt 0) {
    # do something with your output
}
else {
    # sorry, no file
}