I have a file share that takes hours to scan. Recently I have been asked to make a script that can send a mail when over 100 files have been changed within one minute. How can I go about this? It could be PowerShell, or anything else that would enable such a "scan".
If you want this to run every 5 minutes, you could retrieve all files with a LastWriteTime property value less than 5 minutes old, then group the files by the minute:
# Only look at files modified within the last 5 minutes
$Threshold = (Get-Date).AddMinutes(-5)

$FilesPerMinute = Get-ChildItem C:\folder\share -Recurse | Where-Object {
    $_.LastWriteTime -ge $Threshold
} | Group-Object { $_.LastWriteTime.Minute } -NoElement

if ($FilesPerMinute | Where-Object { $_.Count -ge 100 }) {
    # alert
}
You might find that a FileSystemWatcher is a better option in your scenario, though.
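For reference, a minimal FileSystemWatcher sketch along those lines; the share path, mail addresses, and SMTP server are placeholders, and the sliding one-minute window is just one way to count the changes:

# Watch the share for changes (path is an assumed placeholder)
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'C:\folder\share'
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true

# Keep the timestamps of recent Changed events in a global queue
$global:changeTimes = [System.Collections.Generic.Queue[datetime]]::new()

Register-ObjectEvent -InputObject $watcher -EventName Changed -Action {
    $now = Get-Date
    $global:changeTimes.Enqueue($now)

    # Drop events older than one minute, then test the threshold
    while ($global:changeTimes.Count -gt 0 -and
           ($now - $global:changeTimes.Peek()).TotalMinutes -gt 1) {
        $null = $global:changeTimes.Dequeue()
    }
    if ($global:changeTimes.Count -gt 100) {
        # SMTP details below are placeholders
        Send-MailMessage -To 'ops@example.com' -From 'monitor@example.com' `
            -Subject 'Over 100 files changed within one minute' `
            -SmtpServer 'smtp.example.com'
        $global:changeTimes.Clear()
    }
}

Note that Changed can fire more than once per file write, and FileSystemWatcher can miss events under heavy load, so the polling approach above may still be the more robust choice.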
I had an Excel script that searches for files by name, and I want to do the same in PowerShell.
I found this example on the forum; it says that to search for a file by name you write the name followed by (*), but when I run it, it finds nothing:
Get-ChildItem -Path "C:\\Folder\\test\*"
What can I do to simplify the code and make it much faster? Waiting 10 minutes to find one file out of 10,000 is very long.
I have a folder with 10,000 files, and Excel, via VBA, finds the file in about 2-3 seconds.
But when I script it in PowerShell via
$find = Get-ChildItem -Path "C:\Folder"
for ($f = 0; $f -lt $find.Count; $f++) {
    $path_name = $find[$f].Name
    if ($path_name -eq 'test') {
        Write-Host 'success'
    }
}
it takes sooooooo long: the script hangs for 10 minutes and does not respond, and only with luck does it ever answer.
How can I find a file by filter using Get-ChildItem?
To make your search faster you can use the -Filter parameter of Get-ChildItem:
$fileName = "test.txt"
$filter = "*.txt"
$status = Get-ChildItem -Path "C:\PS\" -Recurse -Filter $filter | Where-Object {$_.Name -match $fileName}
if ($status) {
Write-Host "$($status.Name) is found"
} else {
Write-Host "No such file is available"
}
You could also compare the speed of the different approaches by using Measure-Command.
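For instance, a sketch (paths assumed) timing the provider-side -Filter against filtering every object afterwards:

# Time the provider-side filter...
Measure-Command {
    Get-ChildItem -Path "C:\PS" -Recurse -Filter "*.txt"
} | Select-Object TotalSeconds

# ...against filtering every object in PowerShell
Measure-Command {
    Get-ChildItem -Path "C:\PS" -Recurse | Where-Object { $_.Name -like "*.txt" }
} | Select-Object TotalSeconds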
If the disk the data is on is slow then it'll be slow no matter what you do.
If the folder is full of files then it'll also be slow depending on the amount of RAM in the system.
Fewer files per folder means better performance, so try to split them up into several folders if possible.
Doing that may also mean you can run several Get-ChildItem calls at once (disk permitting) using PSJobs, as sketched below.
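A minimal sketch of that idea, assuming C:\Folder contains the subfolders to scan and that we look for names starting with "test":

# Start one background job per subfolder (assumed layout)
$jobs = Get-ChildItem -Path "C:\Folder" -Directory | ForEach-Object {
    Start-Job -ArgumentList $_.FullName -ScriptBlock {
        param($dir)
        Get-ChildItem -Path $dir -Recurse -Filter "test*"
    }
}

# Collect the results and clean up
$results = $jobs | Wait-Job | Receive-Job
$jobs | Remove-Job

Job startup has overhead of its own, so this only pays off when the folders are large.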
Using several loops to take care of a related problem usually makes the whole thing run "number of loops" times as long. That's what Where-Object is for (in addition to the -Filter, -Include and -Exclude parameters of Get-ChildItem).
Console I/O takes A LOT of time. Do NOT output ANYTHING unless you have to, especially not inside loops (or cmdlets that act like loops).
For example, including basic statistics:
$StartTime = Get-Date
$FileList = Get-ChildItem -Path "C:\Folder" -File -Filter 'test'
$EndTime = Get-Date
$FileList
$AfterOutputTime = Get-Date
'Seconds taken for listing:'
($EndTime - $StartTime).TotalSeconds
'Seconds taken including output:'
($AfterOutputTime - $StartTime).TotalSeconds
One of my scripts generates nearly 100 lines, while I only need the ones from within 15 minutes of running the script.
I did find a script: How to search a pattern in last 10 minutes of log using a powershell script.
I changed it and got this:
Get-Content .\Downloads\data.txt |
    ForEach-Object { $threshold = (Get-Date).AddMinutes(-130).ToString("yyyy-MM-ddTHH:mm:ssZ") } {
        if ($_ -match "^(?<timestamp>(\d{4}-){2}\d{2}T(\d{2}:){2}\d{2})Z.*$") {
            if ((Get-Date $Matches.timestamp).ToString("yyyy-MM-ddTHH:mm:ssZ") -gt $threshold) {
                $_
            }
        }
    }
With this, I am able to show only the times within these 15 minutes.
However, as you can see in the script pasted above, the time format in my csv file is not handled correctly. The format they used on the linked page is "dd/MM/yyyy hh:mm:ss", while my time format is "yyyy-MM-ddTHH:mm:ssZ"; here is an example: "2020-06-04T11:39:01Z".
I have changed the Get-Date calls to use the correct format, but what I'm struggling with is the 3rd line:
if($_ -match "^(?<timestamp>(\d{4}-){2}\d{2}T(\d{2}:){2}\d{2})Z.*$")
I'm not really sure how to get around this; I have tried moving the code around and more.
Some help would be appreciated, and if you know a better way to do this, let me know.
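For reference, a sketch of one way to match that format and compare DateTime objects instead of strings (the file path comes from the question; the regex is an assumption about the log layout):

$threshold = (Get-Date).AddMinutes(-15)

Get-Content .\Downloads\data.txt | Where-Object {
    # yyyy-MM-ddTHH:mm:ssZ, e.g. 2020-06-04T11:39:01Z
    if ($_ -match '(?<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2})Z') {
        # Compare as DateTime instead of as strings
        [datetime]$Matches.timestamp -gt $threshold
    }
}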
Found out how I could do it!
$WebResponse = Invoke-WebRequest "<link>" |
    ConvertFrom-Json |
    Select-Object -ExpandProperty Links |
    Sort-Object -Property TimestampUtcFromDevice -Descending

$TimeSpan = New-TimeSpan -Start (Get-Date).AddMinutes(-15) `
    -End ($WebResponse | Select-Object -ExpandProperty TimestampUtcFromDevice -First 1)

$WebResponse | Select-Object -First ($TimeSpan.Minutes) |
    Export-Csv "$FilePath\$FQFL" -NoTypeInformation -Delimiter "$delimiter"
Pain in the ass, but I got it!
The first line takes the JSON from the website, expands a property, and sorts it.
Then I find how many minutes lie between the latest entry and 15 minutes back in time. A great thing I found out from one of the responses is that Get-Date knows the time zone.
The last line selects only the first entries, based on the minutes from the previous cmdlet, and exports them to a CSV file.
I am trying to compose a script/one-liner that will find files modified over 10 hours ago in a specific folder, and if there are no such files, print some value or string.
Get-ChildItem -Path C:\blaa\*.* | where {$_.Lastwritetime -lt (date).addhours(-10)}) | Format-table Name,LastWriteTime -HideTableHeaders"
With that one-liner I am getting the wanted result when there are files with a modify time over 10 hours old, but I also need it to print a value/string when there are no results, so that I can monitor it properly.
The reason for this is to use the script/one-liner for monitoring purposes.
The Get-ChildItem cmdlet and Where clause you have would return null if nothing was found, so you have to account for that separately. I would also caution against using Format-Table for output unless you are just using it for reading on screen. If you want a "one-liner", you could do this; all PowerShell code can be a one-liner if you want it to be:
$results = Get-ChildItem -Path C:\blaa\*.* | where {$_.Lastwritetime -lt (date).addhours(-10)} | Select Name,LastWriteTime; if($results){$results}else{"No files found matching criteria"}
You have an added bracket in your code, which might be a copy artifact, that I had to remove. Coded properly, it would look like this:
$results = Get-ChildItem -Path "C:\blaa\*.*" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddHours(-10) } |
    Select-Object Name, LastWriteTime

if ($results) {
    $results
} else {
    "No files found matching criteria"
}
Every hour, data comes into every folder of my dir tree. I need to check whether it does come in every hour, or whether there was any interruption. (For example, no data coming in for 2–3 hours.)
I am trying to write a PowerShell script that will check LastWriteTime for every folder, but that would not catch the gaps: if I checked the logs in the morning, I would see that all is OK because some data came into the folder one hour ago, even though nothing arrived for a few hours earlier.
So IMHO LastWriteTime is not suitable for it.
Also, there is a problem with subfolders. I need to check only the last (leaf) folder in every dir tree, and I do not know how to drop the parent folders, like:
Z:\temp #need to drop
Z:\temp\foo #need to drop
Z:\temp\foo\1 #need to check
I tried to write a script that checks the LastAccessTime, but it throws an error:
Expressions are only allowed as the first element of a pipeline.
The script is as follows:
$rootdir = "Z:\SatData\SatImages\nonprojected\"
$timedec1 = (Get-date).AddHours(-1)
$timedec2 = (Get-date).AddHours(-2)
$timedec3 = (Get-date).AddHours(-3)
$timedec4 = (Get-date).AddHours(-4)
$dir1 = get-childitem $rootdir -recurse | ?{ $_.PSIsContainer } | Select-Object FullName | where-object {$_.lastwritetime -lt $timedec1} | $_.LastWriteTime -gt $timedec4 -and $_.LastWriteTime -lt $timedec3
$dir1
But as I said before it does not solve the problem.
--
The main question is exactly about checking that data is collected continuously. I could build the dir tree by hand, but I need a way to check whether data came into the folder every hour, or whether there were any hours without data...
You can try to set up the PowerShell script to run from the Windows Task Scheduler every hour. This way, the script only has to check whether any data arrived within the past hour, as sketched below.
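A minimal sketch of such an hourly check, assuming the root path from the question and a hypothetical log file for alerts; "leaf" folders are taken to be those with no subdirectories:

$rootdir = "Z:\SatData\SatImages\nonprojected"
$cutoff  = (Get-Date).AddHours(-1)

# Leaf folders only: directories that contain no further directories
$leafDirs = Get-ChildItem $rootdir -Recurse -Directory |
    Where-Object { -not (Get-ChildItem $_.FullName -Directory) }

foreach ($dir in $leafDirs) {
    $recent = Get-ChildItem $dir.FullName -File |
        Where-Object { $_.LastWriteTime -ge $cutoff }
    if (-not $recent) {
        # No data this hour: record the gap (log path is hypothetical)
        Add-Content -Path "C:\logs\gaps.log" `
            -Value "$(Get-Date -Format s) no new data in $($dir.FullName)"
    }
}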
I'm trying to find files from the previous day to copy, but my simple Get-ChildItem isn't working. It works with every other comparison operator except -eq. Any suggestions for listing only files from the previous day?
get-childitem c:\users| where-object {$_.LastWriteTime -eq (get-date).adddays(-2)}
You are looking for files that were written at an exact time (hours, minutes, seconds, year, month, and two days before). Unless a file was written to the exact second two days ago, you will not find it. In other words, you were comparing full DateTime objects, not just dates, so there is very little chance they were exactly equal; this made it seem as if -eq doesn't work while other comparisons do.
You probably wanted to just compare the dates, without the times:
$yesterday = (get-date).AddDays(-1).Date
gci c:\users | ?{ $_.LastWriteTime.Date -eq $yesterday}
(I also moved the Get-Date call outside the loop, as you probably don't want to evaluate it again and again.)
They are not equal because they differ in time. If you want an exact date match, use the Date property:
get-childitem c:\users| where-object {$_.LastWriteTime.Date -eq (get-date).adddays(-2).Date}