Check if data is coming in continuously - PowerShell

Every hour, data comes into every folder of my directory tree. I need to check that it really does come in every hour, or whether there was an interruption. (For example, no data coming in for 2–3 hours.)
I am trying to write a PowerShell script that checks LastWriteTime for every folder, but that would not solve the problem of gaps. If I checked the logs in the morning, I would see that all is OK as long as some data came into the folder an hour ago, even if nothing had arrived for a few hours before that.
So IMHO LastWriteTime is not suitable for it.
There is also a problem with subfolders. I need to check only the deepest folder in every branch of the tree, and I do not know how to skip the parent folders, like:
Z:\temp #need to drop
Z:\temp\foo #need to drop
Z:\temp\foo\1 #need to check
I tried to write a script that checks the LastWriteTime, but it throws an error:
Expressions are only allowed as the first element of a pipeline.
The script is as follows:
$rootdir = "Z:\SatData\SatImages\nonprojected\"
$timedec1 = (Get-date).AddHours(-1)
$timedec2 = (Get-date).AddHours(-2)
$timedec3 = (Get-date).AddHours(-3)
$timedec4 = (Get-date).AddHours(-4)
$dir1 = get-childitem $rootdir -recurse | ?{ $_.PSIsContainer } | Select-Object FullName | where-object {$_.lastwritetime -lt $timedec1} | $_.LastWriteTime -gt $timedec4 -and $_.LastWriteTime -lt $timedec3
$dir1
But as I said before it does not solve the problem.
--
To be clear, the main question is exactly about checking continuous data collection. I could build the directory tree by hand, but I need a way to check whether data arrived in each folder every hour, or whether there were hours without any data...

You can set up the PowerShell script to run from the Windows Task Scheduler every hour. That way, the script only has to check whether any data arrived within the past hour.
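A minimal sketch of that hourly check, assuming the root path from the question, PowerShell 3.0+ (for the -Directory/-File switches), and that data only ever lands in leaf folders (folders with no subfolders):

$rootdir = 'Z:\SatData\SatImages\nonprojected'
$cutoff  = (Get-Date).AddHours(-1)

# Leaf folders only: directories that contain no further subdirectories.
$leafFolders = Get-ChildItem -Path $rootdir -Recurse -Directory |
    Where-Object { -not (Get-ChildItem -Path $_.FullName -Directory) }

foreach ($folder in $leafFolders) {
    # The original error ("Expressions are only allowed as the first element
    # of a pipeline") came from piping into a bare comparison; comparisons
    # belong inside Where-Object or an if statement instead.
    $newest = Get-ChildItem -Path $folder.FullName -File |
        Sort-Object LastWriteTime -Descending |
        Select-Object -First 1
    if (-not $newest -or $newest.LastWriteTime -lt $cutoff) {
        "$($folder.FullName): no data in the past hour"  # log or alert here
    }
}

Because the task fires every hour and each miss is logged when it happens, gaps are still visible in the log the next morning even if data starts flowing again later.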

Related

Using Where-Object to find files created within time range

Hi all. I'm trying to take the creation date of a file in one folder and use it to filter files in another, while also making sure they contain the phrase 'MS'. This is what I have so far:
$MSdat_time = $MSdat_file.CreationTime
# Defining maximum and minimum auto save creation times with a window of +/- 5 min
$auto_maxtime = ($MSdat_time).AddMinutes(5)
$auto_mintime = ($MSdat_time).AddMinutes(-5)
# Locating any Auto Save files created within time frame using 'MS' pattern as a parameter in case of multiple files
$autsav_file = Get-ChildItem "\\IP.Address\Test Data\Auto Saves" | `
Where-Object {($_.LastWriteTime -ge $auto_mintime) -and ($_.LastWriteTime -le $auto_maxtime)} | `
Select-String -Pattern 'MS' | Select-Object -Unique Path
I put in 'IP.Address' as a placeholder. So far, it's returning nothing even though I know a file with those parameters exists, and this section of code was working fine yesterday.
Check that your $MSdat_file variable has a value. It could be that a previous PS session gave that variable a value outside of your script.
After assigning $MSdat_file = Get-Item "./out.wav", I was able to get the expected output:
PS /Users/ethansmith> /Users/ethansmith/Documents/test.ps1
PS /Users/ethansmith> $autsav_file
Path
----
/Users/ethansmith/Documents/test.ps1
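A quick guard against that kind of stale session state, using the variable name from the question (this check is an illustrative addition, not part of the original script):

# Fail fast if the input file variable was never assigned in this run.
if ($null -eq $MSdat_file) {
    throw '$MSdat_file is not set; assign it with Get-Item before filtering.'
}
$MSdat_file | Format-List FullName, CreationTime  # confirm which file is in use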
Thank you everyone for your help! @ethan was right, the issue was my $MSdat_file variable having an old value saved to it. Should have traced my code back to the beginning sooner, haha.

Powershell: Compare Last Modified to Specific Date and replace with correct Date

I'm still fairly new to powershell, so please bear with me.
I have 2 almost identical directories. Files and folders from the old directory were copied over to a new directory. However, during this transfer process, something happened to the last modified date. The files and folders in the new directory have incorrect last modified dates (ex: today).
Rather than re-doing the transfer process, which would take a long time, I'd like to write something in powershell that will compare the last modified dates of the two directories and correct the dates in the new directory.
I'd also like to check first if file/folder has been modified since the file transfer. There would be no reason to change the date on those files.
From looking around and googling, I found the following.
I know that I can get the last modified date of a file with:
(Get-Item $filename).LastWriteTime
where $filename is the file directory.
I also came across the following:
dir $directory | ? {$_.lastwritetime -gt "6/1/19" -AND $_.lastwritetime -lt "12/30/19"}
I know I can get information about files that were modified between two dates, and I can tweak this so that "less than" (-lt) checks for files that were not modified after a certain date.
dir $directory | ? {$_.lastwritetime -lt '12/13/19'}
This accomplishes one of my goals: I have a means to check whether or not a file has been modified after a certain date.
I saw this for changing the LastWriteTime value:
$folder = Get-Item C:\folder1
$folder.LastWriteTime = (Get-Date)
and realized this was simply
(Get-Item $filename).LastWriteTime = (Get-Date)
which I could modify to meet my goal of replacing the new file's last write time with the old file's correct time:
(Get-Item $filename).LastWriteTime = (Get-Item $filename2).LastWriteTime
I suppose what I'm struggling with is kind of putting it all together. I know how to recurse through files/folders for copy-item or even Get-Childitem by adding the "recurse" parameter. But I'm having difficulties wrapping my head around recursively navigating through each directory to change the dates.
Thank you for your help.
You could do the following to compare the LastWriteTime property of the original files and folders to the copies, while keeping in mind that files in the copy folder could have been updated since the last transfer date.
# set the date to the last transfer date to determine if the file was updated after that
$lastTransferDate = (Get-Date).AddDays(-10) # just for demo 10 days ago
# set the paths for the rootfolder of the originals and the rootfolder to where everything was copied to
$originalPath = 'D:\OriginalStuff'
$copyPath = 'E:\TransferredStuff'
# loop through the files and folders of the originals
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    # create the full path where the copied file or folder is to be found
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    # test if this object can be found
    if (Test-Path -Path $copy) {
        $item = Get-Item -Path $copy
        # test if the item has not been updated since the last transfer date
        if ($item.LastWriteTime -le $lastTransferDate) {
            # set the timestamp the same as the original
            $item.LastWriteTime = $_.LastWriteTime
        }
    }
}
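To sanity-check the result afterwards, a small follow-up pass (same $originalPath and $copyPath as above; a hedged addition, not part of the original answer) can list any pairs whose timestamps still differ:

# List original/copy pairs whose LastWriteTime still differs after the fix.
Get-ChildItem -Path $originalPath -Recurse | ForEach-Object {
    $copy = Join-Path -Path $copyPath -ChildPath $_.FullName.Substring($originalPath.Length)
    if ((Test-Path -Path $copy) -and ((Get-Item -Path $copy).LastWriteTime -ne $_.LastWriteTime)) {
        $_.FullName.Substring($originalPath.Length)  # relative path of the mismatch
    }
}

Pairs that were legitimately updated since the transfer will show up here too, which is expected.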
Great job with what you've done so far.
Just put what you have into a foreach statement.
Foreach ($item in (gci 'C:\Users\usernamehere\Desktop\folder123' -Recurse)) {
    (Get-Item $item.FullName).LastWriteTime = (Get-Item "C:\Users\usernamehere\Desktop\folderabc\RandomFile.txt").LastWriteTime
}
We wrap the Get-ChildItem command with the -Recurse flag in parentheses so that the command executes on its own and becomes a collection for our foreach loop to traverse. $item is the current item in the loop, and its .FullName property holds the full path to the file. So you will use $item.FullName for the files you are going to set the date on.

PowerShell script: file modify time > 10h and return a value if nothing is found

I am trying to compose a script/one-liner which will find files that have been modified more than 10 hours ago in a specific folder; if there are no such files, I need it to print some value or string.
Get-ChildItem -Path C:\blaa\*.* | where {$_.Lastwritetime -lt (date).addhours(-10)}) | Format-table Name,LastWriteTime -HideTableHeaders"
With that one-liner I am getting the wanted result when there are files with modify time over 10 hours, but I also need it to print a value/string if there are no results, so that I can monitor it properly. The reason for this is to use the script/one-liner for monitoring purposes.
The Get-ChildItem cmdlet and Where clause you have would return null if nothing was found, so you have to account for that separately. I would also caution against using Format-Table for output unless you are only reading it on screen. If you want a one-liner, you could do this; all PowerShell code can be a one-liner if you want it to be.
$results = Get-ChildItem -Path C:\blaa\*.* | where {$_.Lastwritetime -lt (date).addhours(-10)} | Select Name,LastWriteTime; if($results){$results}else{"No files found matching criteria"}
You have an extra bracket in your code, which might be a copy artifact, that I had to remove. Coded properly, it would look like this:
$results = Get-ChildItem -Path "C:\blaa\*.*" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddHours(-10) } |
    Select-Object Name, LastWriteTime

if ($results) {
    $results
} else {
    "No files found matching criteria"
}
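If the monitoring tool reads exit codes rather than strings, a variant of the same check could look like this (path and 10-hour threshold taken from the question; the exit-code convention is an assumption):

# Exit 1 when stale files exist, 0 otherwise, for exit-code-based monitors.
$stale = Get-ChildItem -Path 'C:\blaa' -File |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddHours(-10) }
if ($stale) {
    $stale | Select-Object Name, LastWriteTime
    exit 1
}
'No files found matching criteria'
exit 0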

Tracking last modified date on a large file share

I have a file share that takes hours to scan. Recently I have been asked to make a script that can send a mail when over 100 files have been changed within one minute. How can I go about this? It could be PowerShell, or anything else that would enable such a "scan".
If you want this to run every 5 minutes, you could retrieve all files with a LastWriteTime property value less than 5 minutes old, then group the files by the minute:
$Threshold = (Get-Date).AddMinutes(-5)
$FilesPerMinute = Get-ChildItem C:\folder\share -Recurse | Where-Object {
    $_.LastWriteTime -ge $Threshold
} | Group-Object { $_.LastWriteTime.Minute } -NoElement

if ($FilesPerMinute | Where-Object { $_.Count -ge 100 }) {
    # alert
}
You might find that a FileSystemWatcher is a better option in your scenario, though.
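A minimal sketch of that approach, assuming the same share path as above; the threshold and the Write-Host alert are illustrative stand-ins for the real mail-sending step:

# Count change events per minute instead of rescanning the whole share.
$watcher = New-Object System.IO.FileSystemWatcher 'C:\folder\share'
$watcher.IncludeSubdirectories = $true

$counts = @{}  # minute string -> number of change events seen so far
Register-ObjectEvent -InputObject $watcher -EventName Changed -MessageData $counts -Action {
    $minute = (Get-Date).ToString('yyyy-MM-dd HH:mm')
    $Event.MessageData[$minute]++
    if ($Event.MessageData[$minute] -eq 100) {
        # fires exactly once per offending minute; send the mail here
        Write-Host "Over 100 files changed during $minute"
    }
} | Out-Null
$watcher.EnableRaisingEvents = $true  # start watching

Created and Renamed events can be registered the same way if those should count as changes too.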

Can I keep a PowerShell script from deleting moved and cut files?

My need is to delete all files older than 14 days in a public folder. I have cobbled together a PowerShell script that just about does the trick as I need it... The only problem is, if a user moves a file into the folder - as opposed to copying it - my script will delete that file if it was last accessed more than 14 days ago, even if it was moved into the public folder that same day. The same thing happens with cut and paste. So this is a pretty serious problem.
Here is my script:
# Delete all files older than "file_age" days, at "path".
$path = "C:\Users\emcguire\Desktop\Test"
$file_age = -14
$current_date = Get-Date
$date_to_delete = $current_date.AddDays($file_age)
Get-ChildItem $path -Recurse | Where-Object { $_.LastAccessTime -lt $date_to_delete } | Remove-Item
I am pretty new to PowerShell, so I may be missing something very obvious. Is there an easy way to check for files that were moved into the folder but do not have their access timestamp changed? Is there a better way to approach this?
I appreciate any help!
The LastAccessTime property is notoriously unreliable; try to use LastWriteTime wherever possible first. Additionally, all of those properties are stale, meaning they aren't refreshed when you read them. Use this code to call the Refresh method to guarantee you've got fresh file system info before you query the property:
$file = 'C:\somefile.txt'
$fileObj = New-Object System.IO.FileInfo $file
$fileObj.Refresh()
As you want to base your actions on how long the file has been in the archive directory, you may want to use the CreationTime attribute instead. I added a link below to the list of available properties in case there's a better one for your needs.
For reference on the Refresh method
For reference on the available properties
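A hedged adaptation of the question's script along those lines (note that a move within the same NTFS volume preserves CreationTime, so this mainly protects copies and cross-volume moves):

# Delete only files whose CreationTime AND LastWriteTime are both old,
# so a file that just arrived in the folder is not removed.
$path = "C:\Users\emcguire\Desktop\Test"
$date_to_delete = (Get-Date).AddDays(-14)
Get-ChildItem $path -Recurse -File |
    Where-Object { $_.CreationTime -lt $date_to_delete -and $_.LastWriteTime -lt $date_to_delete } |
    Remove-Item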