PowerShell Get-ChildItem excludes a range of dates

I'm using Get-ChildItem to get all of the Tomcat log files whose date is not equal to the current date/today. I would now like to get the Tomcat log files whose date is not in a range of dates. For example, the filenames of the last 7 days should not be listed.
Sample of tomcat logs filename:
catalina.2018-12-21.log
host-manager.2018-12-21.log
$date=Get-Date (get-date).addDays(-0) -UFormat "%Y-%m-%d"
$file=Get-ChildItem "C:\tomcat\logs" -exclude "*$date*"
foreach($files in $file)
{
Move-Item -Path $files -Destination "C:\expiredlogs"
}
[.....] Get all of the log filenames where the date is not in the last 7 days range from "C:\expiredlogs"
Is there any good, efficient way to retrieve all the filenames not in the range 7 days ago till now?

I'm assuming that you just want to get all files, regardless of their name. Until now you based that on the file name itself, but you could base the search on the attributes of the files instead. In this example I'm getting all files that are older than 7 days.
$files = Get-ChildItem "C:\tomcat\logs" | Where-Object LastWriteTime -lt (Get-Date).AddDays(-7).Date
foreach ($file in $files)
{
    Move-Item -Path $file.FullName -Destination "C:\expiredlogs"
}
The above code only looks at the write time of the file, regardless of the file name. You could limit that further if needed by applying other filters.
Updated based on the recommendations from @LotPings.
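For example, if only the dated Tomcat logs should be touched, a name filter could be combined with the date check. A minimal sketch, assuming every relevant log name ends in a yyyy-MM-dd date plus .log (the wildcard pattern is just one way to match names like catalina.2018-12-21.log):
# move only the dated Tomcat logs that are older than 7 days
$cutoff = (Get-Date).AddDays(-7).Date
Get-ChildItem "C:\tomcat\logs" -Filter '*.????-??-??.log' |
    Where-Object LastWriteTime -lt $cutoff |
    Move-Item -Destination "C:\expiredlogs"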

If you insist on using the file names, you need to parse the names into dates, since Get-ChildItem doesn't know about dates. Something like this should do the trick:
Get-ChildItem "c:\programdata\dgs\cathi\log" | `
where { ([DateTime]::ParseExact($_.Name.Substring($_.Name.Length-14,10),'yyyy-MM-dd', $null) -lt (Get-Date).addDays(-7))}
The number 14 is not a magic number; it's the length of the date (10 characters) plus '.log' (4 characters).

The above-mentioned method using LastWriteTime is the correct way to do this. But since there are timestamps in the filenames, as in your case, filtering can be more efficient than Where-Object, and -Exclude accepts arrays.
First create an array of dates which should be excluded:
$range = -3 .. -5 | ForEach-Object { "*$(Get-Date (Get-Date).addDays($_) -UFormat '%Y-%m-%d')*" }
Which today returns:
*2018-12-18*
*2018-12-17*
*2018-12-16*
And pass that to the Get-ChildItem:
Get-ChildItem "C:\tomcat\logs" -Exclude $range

Related

How to rename folders to put the year first using PowerShell

I'm trying to organize some old photos that are split into many different folders. All of the folder names do contain the year, but almost always at the end of the folder name. This doesn't work very well when I'm trying to sort all of my photos from the past 20 years. I'm trying to write a script that would loop through all of the folder names and move the year (YYYY) to the beginning of the folder name.
Current Folders:
The best trip ever 2012
Visiting relatives 2010
2017 trip
Old photos from 2001
Convert to:
2012 The best trip ever
2010 Visiting relatives
2017 trip
2001 Old photos from
I am not very familiar with PowerShell, so I've spent a few hours fiddling with regex and the necessary scripts to filter to the right subset of folder names (those that start with a letter and contain a 4-digit year), but I'm struggling to actually rename them successfully.
Here's what I have:
$folders = Get-ChildItem -Path C:\Users\User\pictures\ | Where-Object { $_.Name -match '^[a-zA-Z].+[0-9]{4}' }
foreach ($folder in $folders)
{ $folder.Name.Split() | Where {$_ -match "[0-9]{4}"}
Rename-Item -Path $folder-NewName "$($Matches[0])_$folder.Name"
}
Any help is appreciated!
If you use the -match operator with a regex that captures the name parts of interest via capture groups ((...)), you can rearrange these name parts, as reflected in the automatic $Matches variable, in a delay-bind script block passed to the Rename-Item call:
Get-ChildItem -Directory C:\Users\User\pictures |
    Where-Object Name -match '^(.+?) ?\b(\d{4})\b(.*?)$' |
    Rename-Item -WhatIf -NewName {
        '{0} {1}{2}' -f $Matches.2, $Matches.1, $Matches.3
    }
Note: The -WhatIf common parameter in the command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
For an explanation of the regex and the ability to interact with it, see this regex101.com page.
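As a quick illustration of what the capture groups hold, here is the regex applied to one of the sample folder names at an interactive prompt (expected values shown as comments):
'The best trip ever 2012' -match '^(.+?) ?\b(\d{4})\b(.*?)$'   # $true
$Matches.1                                           # 'The best trip ever'
$Matches.2                                           # '2012'
$Matches.3                                           # ''  (nothing after the year)
'{0} {1}{2}' -f $Matches.2, $Matches.1, $Matches.3   # '2012 The best trip ever'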
Note: The following simplification, which uses the -replace operator, works in principle, but, unfortunately, reports spurious "Source and destination path must be different" errors as of PowerShell 7.2.1, for all directories whose name either doesn't contain a 4-digit year or already has it at the start of the name:
# !! Works, but reports spurious errors as of PowerShell 7.2.1
Get-ChildItem -Directory C:\Users\User\pictures |
    Rename-Item -WhatIf -NewName { $_.Name -replace '^(.+?) ?\b(\d{4})\b(.*?)$', '$2 $1$3' }
The reason is Rename-Item's unfortunate behavior of complaining when trying to rename a directory to its existing name (which happens when the -replace operation doesn't find a regex match and therefore returns the input string as-is), which should be a quiet no-op, as it already is for files - see GitHub issue #14903.

PowerShell: Compare Last Modified to Specific Date and replace with correct Date

I'm still fairly new to powershell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in powershell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first if file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the file directory.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information regarding files that were modified between two dates. I can tweak this so that "less than" (-lt) is used to check for files that were not modified past a certain date.
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals. I have a means to check if a file has been modified past a certain or not.
I saw this for changing the value lastwritetime
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
Which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is kind of putting it all together. I know how to recurse through files/folders for copy-item or even Get-Childitem by adding the "recurse" parameter. But I'm having difficulties wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to that of the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the rootfolder of the originals and the rootfolder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # create the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}
Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -recurse)){
(Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach command to traverse. $item is the current item in the loop. We will want to use the .FullName property to know the full path to the file for the current item. With that said, you will use $item.FullName for the files you are going to set the date on.
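If each copied file should instead take its timestamp from its matching original (as in the previous answer) rather than from one fixed file, the same foreach pattern can derive the original's path from $item.FullName. A sketch, with placeholder folder names standing in for your own paths:
$copyRoot     = 'C:\Users\usernamehere\Desktop\folder123'   # placeholder: the copied tree
$originalRoot = 'C:\Users\usernamehere\Desktop\folderabc'   # placeholder: the original tree
foreach ($item in (Get-ChildItem $copyRoot -Recurse)) {
    # build the matching original path from the copy's full path
    $original = Join-Path $originalRoot $item.FullName.Substring($copyRoot.Length)
    if (Test-Path $original) {
        $item.LastWriteTime = (Get-Item $original).LastWriteTime
    }
}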

PowerShell find most recent file

I'm new to powershell and scripting in general. Doing lots of reading and testing and this is my first post.
Here is what I am trying to do. I have a folder that contains sub-folders for each report that runs daily. A new sub-folder is created each day.
The file names in the sub-folders are the same with only the date changing.
I want to get a specific file from yesterday's folder.
Here is what I have so far:
Get-ChildItem -filter "MBVOutputQueriesReport_C12_Custom.html" -recurse -path D:\BHM\Receive\ | where(get-date).AddDays(-1)
Both parts (before and after pipe) work. But when I combine them it fails.
What am I doing wrong?
What am I doing wrong?
0,1,2,3,4,5 | Where { $_ -gt 3 }
this will compare the incoming number from the pipeline ($_) with 3 and allow things that are greater than 3 to get past it - whenever the $_ -gt 3 test evaluates to $True.
0,1,2,3,4,5 | where { $_ }
this has nothing to compare against - in this case, it casts the value to boolean - 'truthy' or 'falsey' and will allow everything 'truthy' to get through. 0 is dropped, the rest are allowed.
Get-ChildItem | where Name -eq 'test.txt'
without the {} is a syntax where it expects Name is a property of the thing coming through the pipeline (in this case file names) and compares those against 'test.txt' and only allows file objects with that name to go through.
Get-ChildItem | where Length
In this case, the property it's looking for is Length (the file size) and there is no comparison given, so it's back to doing the "casting to true/false" thing from earlier. This will only show files with some content (non-0 length), and will drop 0 size files, for example.
ok, that brings me to your code:
Get-ChildItem | where(get-date).AddDays(-1)
With no {} and only one thing given to Where, it's expecting the parameter to be a property name, and is casting the value of that property to true/false to decide what to do. This is saying "filter where the things in the pipeline have a property named "09/08/2016 14:12:06" (yesterday's date with the current time) and the value of that property is 'truthy'". No files have a property called (yesterday's date), so that test reads $null for every file, and Where drops everything from the pipeline.
You can do as Jimbo answers, and filter by comparing the file's write time against yesterday's date. But if you know the files and folders are named in date order, you can save recursing through the entire folder tree and looking at everything, because you know what yesterday's file will be called.
Although you didn't say exactly how the folders are named, you could take either of these approaches:
$yesterday = (Get-Date).AddDays(-1).ToString('MM-dd-yyyy')
Get-ChildItem "d:\receive\bhm\$yesterday\MBVOutputQueriesReport_C12_Custom.html"
# (or whatever date pattern gets you directly to that file)
or
Get-ChildItem | sort -Property CreationTime -Descending | Select -Skip 1 -First 1
to get the 'last but one' thing, ordered by reverse created date.
Read the output from get-date | Get-Member -MemberType Property and then apply the Where-Object docs:
Get-ChildItem -filter "MBVOutputQueriesReport_C12_Custom.html" -recurse -path D:\BHM\Receive\ |
    Where-Object {$_.LastWriteTime.Date -eq (get-date).AddDays(-1).Date}
Try:
where {$_.lastwritetime.Day -eq ((get-date).AddDays(-1)).Day}
You could pipe the results to the Sort command, and pipe that to Select to just get the first result.
Get-ChildItem -filter "MBVOutputQueriesReport_C12_Custom.html" -recurse -path D:\BHM\Receive\ | Sort LastWriteTime -Descending | Select -First 1
Can do something like this.
$time = (get-date).AddDays(-1).Day
Get-ChildItem -Filter "MBVOutputQueriesReport_C12_Custom.html" -Recurse -Path D:\BHM\Receive\ | Where-Object { $_.LastWriteTime.Day -eq $time }

Order of numbered files returned by Get-ChildItem

I need to change the order in which Get-ChildItem returns the filenames in a directory, because it is important to process them in a specific order. The filenames have the following format: f_1.csv, f_2.csv, f_3.csv, f_4.csv, ..., f_10.csv, f_11.csv, etc.
The default order returned by Get-ChildItem is: f_1.csv, f_10.csv, f_100.csv, ...
One solution would be to change the filenames to f_001.csv, f_002.csv, f_003.csv, but I have no control over how the files were created and I do not know how many there are (there may be hundreds, thousands, etc.).
My current code is:
foreach($file in Get-ChildItem $path -Filter f_*.csv)
{
#process the files here
}
Thank you
Something like this?
Get-ChildItem -Filter f_*.csv |
    Sort-Object @{Expression = { [int]($_.Name -replace 'f_(\d+).+', '$1') }; Descending = $false}
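If it helps to see it in context, the sorted output can feed straight into the loop from the question (a sketch reusing the question's $path and the same sort expression):
# sort numerically on the digits in the name, then process in order
$sorted = Get-ChildItem $path -Filter f_*.csv |
    Sort-Object { [int]($_.Name -replace 'f_(\d+).+', '$1') }
foreach ($file in $sorted)
{
    #process the files here, now in f_1, f_2, ..., f_10, f_11 order
}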

Using Get-ChildItem to get a list of files modified in the last 3 days

Code as it is at the moment
get-childitem c:\pstbak\*.* -include *.pst | Where-Object { $_.LastWriteTime -lt (get-date).AddDays(-3)} |
Essentially what I am trying to do is get a list of all PST files in the folder above, based on them being newer than 3 days old. I'd then like to count the results. The above code doesn't error but brings back zero results (there are definitely PST files in the folder that are newer than three days). Anyone have any idea?
Try this:
(Get-ChildItem -Path c:\pstbak\*.* -Filter *.pst | ? {
$_.LastWriteTime -gt (Get-Date).AddDays(-3)
}).Count
Very similar to previous responses, but this one starts from the current directory, looks at any file, and only includes files modified in the last 4 days. This is what I needed for my research and the above answers were all very helpful. Thanks.
Get-ChildItem -Path . -Recurse| ? {$_.LastWriteTime -gt (Get-Date).AddDays(-4)}
Here's a minor update to the solution provided by Dave Sexton. Many times you need multiple filters. The -Filter parameter can only take a single string, whereas the -Include parameter can take a string array. If you have a large file tree, it also makes sense to only get the date to compare with once, not for each file. Here's my updated version:
$compareDate = (Get-Date).AddDays(-3)
(Get-ChildItem -Path c:\pstbak\*.* -Include '*.pst','*.mdb' -Recurse | Where-Object { $_.LastWriteTime -gt $compareDate }).Count
I wanted to just add this as a comment to the previous answer, but I can't. I tried Dave Sexton's answer but had problems if the count was 1. This forces an array even if one object is returned.
([System.Object[]](gci c:\pstbak\ -Filter *.pst |
? { $_.LastWriteTime -gt (Get-Date).AddDays(-3)})).Count
It still doesn't return zero if empty, but testing '-lt 1' works.
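An alternative that does return zero for an empty result is the array subexpression operator @(), which always yields an array, so .Count works for 0, 1, or many files. A sketch with the same filter:
# @() guarantees an array, so .Count is 0 when nothing matches and 1 for a single file
@(Get-ChildItem c:\pstbak\ -Filter *.pst |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-3) }).Count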
By the title of the question, the answer to find every file modified today is:
Get-ChildItem -Path .\*.pst | ? {$_.LastWriteTime -gt (Get-Date).AddDays(-1)}
Modified in the last three days:
Get-ChildItem -Path .\*.pst | ? {$_.LastWriteTime -gt (Get-Date).AddDays(-3)}
Modified in the last three days and count:
(Get-ChildItem -Path .\*.pst | ? {$_.LastWriteTime -gt (Get-Date).AddDays(-3)}).Count