Alert if no files written > 7 days - PowerShell

Each week a backup file replicates over to a folder on our file server. I'm looking for a way to be notified if no file has been written to that folder in over 7 days.
I found this script online (apologies to the author for not being able to credit them), and I feel like it puts me on the right track. What I'm really looking for, though, is some kind of output that tells me if a file hasn't been written at all. I don't need confirmation when the backup is successful.
$lastWrite = (Get-Item C:\ExampleDirectory).LastWriteTime
$timespan = New-TimeSpan -Days 7
if (((Get-Date) - $lastWrite) -gt $timespan) {
    # older
} else {
    # newer
}

What you'll want to do is grab all files in the directory, sort by LastWriteTime, and then compare the newest file's timestamp to 7 days ago:
$LastWriteTime = (Get-ChildItem C:\ExampleDirectory | Sort-Object LastWriteTime)[-1].LastWriteTime
if ($LastWriteTime -gt [DateTime]::Now.AddDays(-7)) {
    # A file newer than 7 days is present
}
else {
    # Something is wrong, time to alert!
}
For the alerting part, check out Send-MailMessage or Write-EventLog
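For example, a minimal Send-MailMessage call could look like this (the server and addresses below are placeholders you'd replace with your own):

```powershell
# Hypothetical SMTP server and addresses -- replace with your own
Send-MailMessage -SmtpServer 'mail.example.com' `
    -From 'alerts@example.com' `
    -To 'admin@example.com' `
    -Subject 'Backup missing' `
    -Body 'No file written to C:\ExampleDirectory in the last 7 days.'
```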

Manage (partial) duplicates in a file directory, based on file name with Powershell

I have a script that moves CSV files from one directory to another. I only move files modified today:
$TodaysDate = (Get-Date).Date
Get-ChildItem $PIM25 -File |
    ForEach-Object {
        if ($_.LastWriteTimeUtc -ge $TodaysDate.ToUniversalTime()) {
            Copy-Item $_.FullName -Destination $TargetDirectory
            Write-Host "$_ was updated today, so it was moved"
        }
        else {
            # Write-Host "$_ wasn't copied"
        }
    }
Now I need to deal with duplicates. I mean partial duplicates...
Here are the types of files I'm working on.
As you can see, the name's structure is always the same: PIM25^^ID^^HOURS.csv
My goal is to move only one file per ID. And if there are 2 or 3 files with the same ID, I want to move only the one with the most recent time (based on the hour in the file name).
Hope someone can help, as I'm still a beginner in PowerShell. Thanks!
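A rough sketch of one way to do this, assuming the names really do follow PIM25^^ID^^HOURS.csv (so splitting the base name on ^^ yields the ID at index 1 and the hour at index 2), and reusing $PIM25 and $TargetDirectory from the script above:

```powershell
# Sketch: group files by the ID segment of the name and keep only the
# newest file (by the HOURS segment) per ID. Assumes names like
# PIM25^^1234^^0930.csv; $PIM25 and $TargetDirectory come from the script above.
Get-ChildItem $PIM25 -File -Filter '*.csv' |
    Group-Object { ($_.BaseName -split '\^\^')[1] } |   # index 1 = the ID
    ForEach-Object {
        $newest = $_.Group |
            Sort-Object { ($_.BaseName -split '\^\^')[2] } |   # index 2 = HOURS
            Select-Object -Last 1
        Copy-Item $newest.FullName -Destination $TargetDirectory
    }
```

Sorting on the HOURS segment as a string works here because fixed-width, zero-padded times sort the same way lexically as chronologically.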

Powershell: Deleting old backup files but leave the recent one even if it's older

I'm trying to write a .ps1 script that deletes files older than 2 days but keeps the most recent files even when they are older. For the delete part, the internet is full of code snippets to copy/paste. It's the "keep the recent files" part I'm stuck on.
The structure of the backup folder is the following:
--Db.yyyy.MM.dd.Native.bak.zip
--Files.yyyy.MM.dd.zip
--Log.yyyy.MM.dd.txt
--AND SO ON WITH THE OLDER FILES
I want to keep the most recent trio of these files, even if they are older than 2 days.
If anyone has a suggestion for the right approach, or a solution, I'm here to learn.
Thanks to all.
P.S. This is the first time I've used PowerShell, and I have to write this script for work.
I'd like to get you started so you have an idea how to approach this. It's not too hard, actually, if you approach it logically. First, you need to obtain the correct files from the backup folder. Then you have to examine each file by parsing the filename.
I wonder if you can't just take the file date and sort on the oldest? But if you really need to parse the filename, I wrote a very rough script showing how such an approach could look. Keep in mind, I just did some quick-and-dirty replaces to make it work:
# Get files
$zipFilesInFolder = Get-ChildItem -Path "C:\Temp" |
    Where-Object { !$_.PSIsContainer -and ($_.Name -like "*Files*") } |
    Sort-Object -Property Name -Descending
Write-Host 'Files found:' $zipFilesInFolder
# Track the oldest file seen so far
[datetime]$oldestDate = Get-Date
[string]$oldestFile = ''
# Check each file by parsing the filename
foreach ($i in $zipFilesInFolder) {
    $fileDate = $i -replace 'Files.' -replace '.zip', ''
    $parsedDate = [datetime]::ParseExact($fileDate, 'yyyy-MM-dd', $null)
    # If we find an older file than the one we currently have in memory, re-assign
    if ($parsedDate -lt $oldestDate) {
        Write-Host 'Older file found than:' $oldestDate ', oldest is now:' $i
        $oldestDate = $parsedDate
        $oldestFile = $i
    }
}
# Display the result
Write-Host 'Oldest file found:' $oldestFile
I created a directory: C:\Temp with the files:
Files.2021-04-21.zip up to Files.2021-04-26.zip
The output looks like this:
Files found: Files.2021-04-26.zip Files.2021-04-25.zip Files.2021-04-23.zip Files.2021-04-22.zip Files.2021-04-21.zip Files.2021-04-21.zip
Older file found than: 26-4-2021 10:17:01 , oldest is now: Files.2021-04-26.zip
Older file found than: 26-4-2021 00:00:00 , oldest is now: Files.2021-04-25.zip
Older file found than: 25-4-2021 00:00:00 , oldest is now: Files.2021-04-23.zip
Older file found than: 23-4-2021 00:00:00 , oldest is now: Files.2021-04-22.zip
Older file found than: 22-4-2021 00:00:00 , oldest is now: Files.2021-04-21.zip
Oldest file found: Files.2021-04-21.zip
This should be enough to get your assignment done :) Good luck!
AGAIN, I want to stress that you are probably better off looking at the file's modified date instead of the filename. In that case, do this:
# Get files
$zipFilesInFolder = Get-ChildItem -Path "C:\Temp" |
    Where-Object { !$_.PSIsContainer -and ($_.Name -like "*Files*") } |
    Sort-Object -Property LastWriteTime -Descending
Write-Host 'Files found:' $zipFilesInFolder
# Check each file
foreach ($i in $zipFilesInFolder) {
    $i # Shows files from top to bottom, based on last modified date
}
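To address the "keep the most recent trio" requirement directly, here is a rough sketch. It assumes every backup filename contains a yyyy.MM.dd date (as in the question), treats all files sharing the newest date as the trio to keep, and uses -WhatIf so nothing is deleted until you trust the output:

```powershell
$backupFolder = 'C:\Temp'   # adjust to your backup folder
$files = Get-ChildItem -Path $backupFolder -File

# Pull the yyyy.MM.dd date out of each name; that string sorts chronologically
$newestDate = $files |
    ForEach-Object { if ($_.Name -match '\d{4}\.\d{2}\.\d{2}') { $Matches[0] } } |
    Sort-Object -Unique |
    Select-Object -Last 1

# Delete files older than 2 days, except the most recent trio
$cutoff = (Get-Date).AddDays(-2)
$files |
    Where-Object { $_.Name -notlike "*$newestDate*" -and $_.LastWriteTime -lt $cutoff } |
    Remove-Item -WhatIf   # remove -WhatIf once the output looks right
```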

Monitoring DFSR for a specific file

I have a script that creates a large template file (about 40-50 GB) within VMware. The template file is created on a datastore that is replicated between two datacentres using DFSR.
I need to monitor the DFSR status so that, once a specific file has replicated, the script can go on to do some other bits and pieces.
$file = 'myfile.template'
$timeout = New-TimeSpan -Minutes 5
Start-Sleep -Seconds 5
$sw = [System.Diagnostics.Stopwatch]::StartNew()
while ($sw.Elapsed -lt $timeout) {
    $DFSRStatus = get-dfsrstatus | Where-Object { $_.Name -eq $file }
    if ($DFSRStatus.something -eq 'ok') {
        return
    }
    Start-Sleep -Seconds 5
}
Exit
Unfortunately, I've not used the DFSR module before. I tried adding a file to check the replication status, to see what happens when a file gets replicated. But even with a 1 GB test file, I wasn't able to see what happens while the replication is in progress or when it succeeds.
Hopefully someone with more experience with DFSR can highlight the right syntax to check DFSR for a specific file and what the correct 'success' event is.
Another option: instead of trying to monitor DFSR, just check whether the file is at the target location and has the correct size and date compared to the original.
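A rough sketch of that polling approach (the UNC paths are hypothetical; since DFSR preserves the source timestamps, matching size plus LastWriteTimeUtc is a reasonable proxy for "replicated"):

```powershell
$source = '\\dc1\datastore\myfile.template'   # hypothetical source path
$target = '\\dc2\datastore\myfile.template'   # hypothetical replicated path
$timeout = New-TimeSpan -Minutes 5
$sw = [System.Diagnostics.Stopwatch]::StartNew()

while ($sw.Elapsed -lt $timeout) {
    if (Test-Path $target) {
        $src = Get-Item $source
        $dst = Get-Item $target
        # Same size and same last-write time: assume replication finished
        if ($dst.Length -eq $src.Length -and
            $dst.LastWriteTimeUtc -eq $src.LastWriteTimeUtc) {
            return
        }
    }
    Start-Sleep -Seconds 5
}
Write-Warning "Timed out waiting for $target to replicate."
```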

Powershell script to detect log file failures

I run perfmon on a server and the log files go into a c:\perfmon folder. The perfmon task restarts each week and the log files just collect there over time, so the folder can contain CSV files with a range of modified dates.
I would like to write a PowerShell script that checks that folder to make sure there is a file with today's modified date in it. If there isn't one for today, I would like a FAILED email so that I know to look at perfmon for issues.
Has anyone written anything like this? I have tried several of the scripts on here and none do it exactly how I would like. This is what I have so far, based on other scripts.
It sort of works, but it checks every file and responds for every file as well. If I had 3 files over the last three days and none for today, I would get 3 emails saying FAIL. If I have one for today and two older ones, I get 3 OK emails. If I have just one file for today, I get one OK email. How do I restrict this to just one email for a fail or success? There could be 50-100 files in that folder after two years, and I just want a FAIL if none of them were modified today.
I hope that all makes sense. I'm afraid my PowerShell skills are very weak.
$EmailTech = @{
    To         = 'a@a.com'
    SmtpServer = 'relayserver'
    From       = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)
Get-ChildItem -Path c:\perflogs\logs | ForEach-Object {
    $Files = (Get-ChildItem -Path c:\perflogs\logs\*.csv |
        Where-Object { $_.LastWriteTime -gt $CompareDate } |
        Measure-Object).Count
    if ($Files -eq 0) {
        $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
        $EmailTech.Body = 'A performance monitor log file was not found on ServerA for today'
        Send-MailMessage @EmailTech
    }
    else {
        # If we found files it's OK and we don't report it
        $EmailTech.Subject = 'Perfmon File ServerA - OK'
        $EmailTech.Body = 'A performance monitor log file was found on ServerA for today'
        Send-MailMessage @EmailTech
    }
}
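One way to get exactly one email: drop the outer ForEach-Object loop so the count is computed once, set the subject and body in the if/else, and send a single message at the end. A sketch based on the code above (addresses and paths are the question's placeholders):

```powershell
$EmailTech = @{
    To         = 'a@a.com'
    SmtpServer = 'relayserver'
    From       = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)

# Count matching files once, outside any per-file loop
$Files = @(Get-ChildItem -Path c:\perflogs\logs\*.csv |
    Where-Object { $_.LastWriteTime -gt $CompareDate }).Count

if ($Files -eq 0) {
    $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
    $EmailTech.Body    = 'A performance monitor log file was not found on ServerA for today'
}
else {
    $EmailTech.Subject = 'Perfmon File ServerA - OK'
    $EmailTech.Body    = 'A performance monitor log file was found on ServerA for today'
}
Send-MailMessage @EmailTech   # exactly one email, whatever the file count
```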

PowerShell - Move file, Rename & Rotate

I am fairly new to PowerShell and still learning. I have completed my first script and am now trying to add some logging to it. I can append to the log file OK, but I'm stuck on backing up the log and rotating it. This is what I have so far:
$CheckFile = Test-Path $Logfilepath
if ($CheckFile -eq $false) {
    $Date = (Get-Date).ToString()
    $Date + ' - Automonitor log created - INFO' | Out-File -Append -Force $Logfilepath
}
else {
    if ((Get-Item $Logfilepath).Length -gt $Size) {
        Move-Item $Logfilepath -Destination $LogsOldFolder -Force
    }
}
This is where I am stuck. If the file is bigger than 5 MB, I need it moved to another folder (which I have in the script), but once it's in that folder I only want to keep the 5 newest files, to avoid storage issues. I need the files named like the below.
Automonitor.log.1
Automonitor.log.2
Automonitor.log.3
Automonitor.log.4
Automonitor.log.5
Automonitor.log.1 would be the newest file. So I am really baffled about the process I would take: how to rename the files to match the above format, and, when a new file is copied over, how to rename them all again based on creation date and delete the oldest, so that only 5 files ever exist.
I hope that makes sense. If anyone has any ideas, that would be great.
You can go this way:
$a = Get-ChildItem $destfolder
if ($a.Count -gt 5) {
    $a | Sort-Object LastWriteTime | Select-Object -First ($a.Count - 5) | Remove-Item
}
This removes every file except the 5 newest.
Note that this snippet doesn't care about the filenames. If you want that, change the Get-ChildItem call to use a wildcard filter.
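For the numbered-rotation part of the question, here is a sketch. It assumes $LogsOldFolder holds copies named Automonitor.log.1 through Automonitor.log.5 (with .1 the newest) and that $Logfilepath is the oversized log from the script above:

```powershell
$base = Join-Path $LogsOldFolder 'Automonitor.log'

# Drop the oldest copy, if present
if (Test-Path "$base.5") { Remove-Item "$base.5" }

# Shift the rest down: .4 -> .5, .3 -> .4, .2 -> .3, .1 -> .2
for ($n = 4; $n -ge 1; $n--) {
    if (Test-Path "$base.$n") {
        Rename-Item "$base.$n" -NewName "Automonitor.log.$($n + 1)"
    }
}

# Move the oversized log in as the newest copy
Move-Item $Logfilepath -Destination "$base.1"
```

Counting down from 4 matters: shifting .4 out of the way first means each rename never collides with a file that hasn't moved yet.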