I'm trying to find files from the previous day to copy, but my simple Get-ChildItem isn't working. It works with every other comparison operator except -eq. Any suggestions on how to list only files from the previous day?
get-childitem c:\users | where-object {$_.LastWriteTime -eq (get-date).adddays(-2)}
You are looking for files that were written at an exact time (year, month, day, hours, minutes, seconds, two days before). Unless a file was written to that exact second two days ago, you will not find it. In other words, you were comparing full DateTime objects rather than just the dates, so there is very little chance they were ever exactly equal, which made it look as though -eq doesn't work while other comparisons do.
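You can see this for yourself (a minimal illustration):
# Two DateTime values captured a moment apart differ in their sub-second ticks:
(Get-Date) -eq (Get-Date)            # almost always False
(Get-Date).Date -eq (Get-Date).Date  # True - .Date truncates the time to midnight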
You probably wanted to just compare the dates, without the times:
$yesterday = (get-date).AddDays(-1).Date
gci c:\users | ?{ $_.LastWriteTime.Date -eq $yesterday}
(Also note that Get-Date is moved outside the pipeline, as you probably don't want to evaluate it again for every file.)
They are not equal because they differ in time. If you want an exact date match, use the Date property:
get-childitem c:\users | where-object {$_.LastWriteTime.Date -eq (get-date).adddays(-2).Date}
I'm trying to organize some old photos that are split into many different folders. All of the folder names do contain the year, but almost always at the end of the folder name. This doesn't work very well when I'm trying to sort all of my photos from the past 20 years. I'm trying to write a script that would loop through all of the folder names and move the year (YYYY) to the beginning of the folder name.
Current Folders:
The best trip ever 2012
Visiting relatives 2010
2017 trip
Old photos from 2001
Convert to:
2012 The best trip ever
2010 Visiting relatives
2017 trip
2001 Old photos from
I am not very familiar with PowerShell, so I've spent a few hours fiddling with regex and the scripts needed to filter to the right subset of folder names (those that start with a letter and contain a 4-digit year), but I'm struggling to actually rename them successfully.
Here's what I have:
$folders = Get-ChildItem -Path C:\Users\User\pictures\ | Where-Object { $_.Name -match '^[a-zA-Z].+[0-9]{4}' }
foreach ($folder in $folders)
{ $folder.Name.Split() | Where {$_ -match "[0-9]{4}"}
Rename-Item -Path $folder-NewName "$($Matches[0])_$folder.Name"
}
Any help is appreciated!
If you use the -match operator with a regex that captures the name parts of interest via capture groups ((...)), you can rearrange those name parts, as reflected in the automatic $Matches variable, in a delay-bind script block passed to the Rename-Item call:
Get-ChildItem -Directory C:\Users\User\pictures |
Where-Object Name -match '^(.+?) ?\b(\d{4})\b(.*?)$' |
Rename-Item -WhatIf -NewName {
'{0} {1}{2}' -f $Matches.2, $Matches.1, $Matches.3
}
Note: The -WhatIf common parameter in the command above previews the operation. Remove -WhatIf once you're sure the operation will do what you want.
For an explanation of the regex and the ability to interact with it, see this regex101.com page.
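To see what the capture groups contain for one of the sample names (illustrative only):
'The best trip ever 2012' -match '^(.+?) ?\b(\d{4})\b(.*?)$'  # -> True
$Matches.1  # 'The best trip ever'
$Matches.2  # '2012'
$Matches.3  # '' (nothing follows the year in this name)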
Note: The following simplification, which uses the -replace operator, works in principle but unfortunately reports spurious 'Source and destination path must be different' errors as of PowerShell 7.2.1, for all directories whose name either doesn't contain a 4-digit year or already has it at the start of the name:
# !! Works, but reports spurious errors as of PowerShell 7.2.1
Get-ChildItem -Directory C:\Users\User\pictures |
  Rename-Item -WhatIf -NewName { $_.Name -replace '^(.+?) ?\b(\d{4})\b(.*?)$', '$2 $1$3' }
The reason is Rename-Item's unfortunate behavior of complaining when asked to rename a directory to its existing name (which happens whenever the -replace operation finds no regex match and therefore returns the input string as-is). This should be a quiet no-op, as it already is for files - see GitHub issue #14903.
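A possible workaround for the -replace variant, if you want to avoid the spurious errors, is to pre-filter so that only directories whose name would actually change reach Rename-Item (a sketch, not part of the original answer):
Get-ChildItem -Directory C:\Users\User\pictures |
  Where-Object { $_.Name -ne ($_.Name -replace '^(.+?) ?\b(\d{4})\b(.*?)$', '$2 $1$3') } |  # skip no-op renames
  Rename-Item -WhatIf -NewName { $_.Name -replace '^(.+?) ?\b(\d{4})\b(.*?)$', '$2 $1$3' }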
I'm using Get-ChildItem to get all of the tomcat log files where the date is not equal to the current date/today. I would like to get the tomcat log files where the date is not in a range of dates. For example the filenames of the last 7 days shall not be listed.
Sample of tomcat logs filename:
catalina.2018-12-21.log
host-manager.2018-12-21.log
$date=Get-Date (get-date).addDays(-0) -UFormat "%Y-%m-%d"
$file=Get-ChildItem "C:\tomcat\logs" -exclude "*$date*"
foreach($files in $file)
{
Move-Item -Path $files -Destination "C:\expiredlogs"
}
[...] Get all of the log filenames where the date is not in the last-7-days range from "C:\expiredlogs"
Is there any good, efficient way to retrieve all the filenames not in the range 7 days ago till now?
I'm assuming that you just want to get all the files, regardless of their names. Until now you based the search on the file name itself, but you can base it on the attributes of the files instead. In this example I'm getting all files that are older than 7 days:
$files = Get-ChildItem "C:\tomcat\logs" | Where-Object LastWriteTime -lt (Get-Date).AddDays(-7).Date
foreach($file in $files)
{
Move-Item -Path $file -Destination "C:\expiredlogs"
}
The above code will only look at the write time for the file, regardless of the file name. You would limit that further if needed, by applying other filters.
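For instance, here is a sketch that additionally restricts the result to .log files (the *.log pattern is my assumption, based on the sample names):
$files = Get-ChildItem "C:\tomcat\logs" -Filter '*.log' |
         Where-Object LastWriteTime -lt (Get-Date).AddDays(-7).Date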
Updated based on the recommendations from @LotPings
If you insist on using the file names you need to parse the names to dates, since Get-ChildItem doesn't know about dates. Something like this should do the trick:
Get-ChildItem "c:\programdata\dgs\cathi\log" | `
where { ([DateTime]::ParseExact($_.Name.Substring($_.Name.Length-14,10),'yyyy-MM-dd', $null) -lt (Get-Date).addDays(-7))}
The number 14 is not a magical number, it's the length of the date + '.log'.
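For example, for one of the sample names (illustrative only):
$name = 'catalina.2018-12-21.log'           # 23 characters long
$name.Substring($name.Length - 14, 10)      # -> '2018-12-21' (10 chars, skipping the trailing '.log')
[DateTime]::ParseExact('2018-12-21', 'yyyy-MM-dd', $null)  # -> a DateTime for 21 December 2018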
The above-mentioned method using LastWriteTime is the correct way to do this. But since there are timestamps in the filenames, as in your case, provider-level filtering can be more efficient than Where-Object, and you can give it arrays.
First create an array of dates which should be excluded:
$range = -3 .. -5 | ForEach-Object { "*$(Get-Date (Get-Date).addDays($_) -UFormat '%Y-%m-%d')*" }
Which today returns:
*2018-12-18*
*2018-12-17*
*2018-12-16*
And pass that to the Get-ChildItem:
Get-ChildItem "C:\tomcat\logs" -Exclude $range
I am in need of a little help with increasing the randomness of Get-Random so it does not repeat selections so often.
I have been using this with powershell:
$formats = @("*.mpg")
$dir = Split-Path $MyInvocation.MyCommand.Path
gci "$dir\*" -include $formats | Get-Random -Count 1 |
Invoke-Item
My application is an in-home virtual television channel that randomly selects 1 file from multiple folders at 6 am and plays them all day until midnight, powers off and starts up, repeating the process daily. The problem I am finding is that whatever the Get-Random command uses has a tendency to choose the exact same file often. Thus, after months of running this script, I am seeing the exact same movies chosen day after day, and some that are never chosen. I'm guessing this is because Get-Random is using the clock as its factor for choosing a number?
Is there a way to increase the odds of getting a broader selection of files (.mpg's in this instance) and fewer repeats of the same .mpg's being chosen?
My other option was to find a script that would keep track of the .mpg's chosen and "mark" them, or sort by date accessed, and not play the same file twice until all files of a particular folder have been played once, then "unmark" all the files and start over. To me that sounds like advanced scripting that I just don't have the knowledge to put together on my own.
Any insight into my dilemma would be vastly appreciated. Any questions about my specifics to help you ascertain a conclusion to this query will be forthcoming.
Maybe what I want to know is how to increase the random distribution? I'm looking for a way to have more variety in the files chosen day after day and fewer repeats.
That (access time) should be fairly easy to check:
# only consider files not accessed (played) within the last 30 days
$item = gci "$dir\*" -include $formats | where { $_.LastAccessTime -le (Get-Date).AddDays(-30) } |
    Get-Random -Count 1
$item.LastAccessTime = Get-Date   # "mark" the file as just played
$item | Invoke-Item
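If you would rather keep an explicit record than rely on access times (your "mark and unmark" idea), here is a minimal sketch; the played.txt file name is my own choice:
# Sketch: remember played files in a text file; reset once every file has been played.
$playedLog = Join-Path $dir 'played.txt'
$played = if (Test-Path $playedLog) { Get-Content $playedLog } else { @() }
$candidates = gci "$dir\*" -include $formats | where { $played -notcontains $_.FullName }
if (-not $candidates) {
    # every file has been played once - "unmark" them all and start over
    Clear-Content $playedLog -ErrorAction SilentlyContinue
    $candidates = gci "$dir\*" -include $formats
}
$item = $candidates | Get-Random -Count 1
$item.FullName | Add-Content $playedLog   # "mark" the file as played
$item | Invoke-Item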
I have a file share that takes hours to scan. Recently I have been asked to make a script that can send a mail when over 100 files have been changed within one minute. How can I go about this? It could be PowerShell, it could be anything that would enable such a "scan".
If you want this to run every 5 minutes, you could retrieve all files with a LastWriteTime property value less than 5 minutes old, then group the files by the minute:
$Threshold = (Get-Date).AddMinutes(-5)
$FilesPerMinute = Get-ChildItem C:\folder\share -Recurse |Where-Object {
$_.LastWriteTime -ge $Threshold
} |Group-Object { $_.LastWriteTime.Minute } -NoElement
if($FilesPerMinute |Where {$_.Count -ge 100}){
# alert
}
You might find that a FileSystemWatcher is a better option in your scenario, though.
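A minimal FileSystemWatcher sketch, assuming the same share path and the 100-changes-per-minute threshold (the counting logic is illustrative, not production-ready):
$watcher = New-Object System.IO.FileSystemWatcher 'C:\folder\share'
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
$global:changeCount = 0
Register-ObjectEvent $watcher Changed -Action { $global:changeCount++ } | Out-Null
while ($true) {
    Start-Sleep -Seconds 60
    if ($global:changeCount -ge 100) { Write-Warning "$global:changeCount changes in the last minute" }
    $global:changeCount = 0   # reset the counter for the next minute
}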
Every hour, data comes into every folder of my dir tree. I need to check that it does come in every hour, or whether there was any interruption. (For example, no data coming in for 2–3 hours.)
I am trying to write a PowerShell script that will check LastWriteTime for every folder, but that would not solve the problem with gaps. If I checked the logs in the morning, I would see that all is OK as long as some data came to the folder one hour ago, even if no data arrived a few hours earlier.
So IMHO LastWriteTime is not suitable for it.
Also there is a problem with subfolders. I need to check only the last folder in every dir tree. I do not know how to drop any previous folders like:
Z:\temp #need to drop
Z:\temp\foo #need to drop
Z:\temp\foo\1 #need to check
I had tried to write a script that checks the LastAccessTime, but it throws an error:
Expressions are only allowed as the first element of a pipeline.
The script is as follows:
$rootdir = "Z:\SatData\SatImages\nonprojected\"
$timedec1 = (Get-date).AddHours(-1)
$timedec2 = (Get-date).AddHours(-2)
$timedec3 = (Get-date).AddHours(-3)
$timedec4 = (Get-date).AddHours(-4)
$dir1 = get-childitem $rootdir -recurse | ?{ $_.PSIsContainer } | Select-Object FullName | where-object {$_.lastwritetime -lt $timedec1} | $_.LastWriteTime -gt $timedec4 -and $_.LastWriteTime -lt $timedec3
$dir1
But as I said before it does not solve the problem.
--
The main question is exactly about checking the continuity of the data collection. I could make the dir tree by hand, but I need a way to check whether data has come to each folder every hour or whether there were hours without data...
You can try setting up the PowerShell script to run via the Windows Task Scheduler (running every hour). That way, the script only has to check whether any data arrived within the past hour.
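A sketch of what that hourly check could look like, flagging leaf folders (those without subfolders) whose newest file is more than an hour old; the root path is taken from your script, and the alerting is left as a simple Write-Warning:
$rootdir = 'Z:\SatData\SatImages\nonprojected'
Get-ChildItem $rootdir -Recurse -Directory |
  Where-Object { -not (Get-ChildItem $_.FullName -Directory) } |   # leaf folders only
  ForEach-Object {
      $newest = Get-ChildItem $_.FullName -File |
                Sort-Object LastWriteTime -Descending |
                Select-Object -First 1
      if (-not $newest -or $newest.LastWriteTime -lt (Get-Date).AddHours(-1)) {
          Write-Warning "No data within the last hour: $($_.FullName)"
      }
  }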