I run perfmon on a server and the log files go into a c:\perfmon folder. The perfmon task restarts each week and the log files just collect in there over time, so there could be a range of CSV files in that folder with different dates.
I would like to write a PowerShell script that will check that folder to make sure there is a file there with today's modified date on it. If there isn't one for today I would like a FAILED email so that I know to look at perfmon for issues.
Has anyone written anything like this? I have tried several of the scripts in here and none do exactly what I would like. This is what I have so far, based on other scripts.
It sort of works, but it is checking all files and responding for all files as well. If I had 3 files over the last three days and none for today I would get 3 emails saying FAIL. If I have one for today and two older ones I get 3 OK emails. If I have just one file for today I get one OK email. How do I restrict this to just one email for a fail or success? There could be 50-100 files in that folder after two years, and I just want a FAIL if none of them were modified today.
Hope that all makes sense. I'm afraid my PowerShell skills are very weak.
$EmailTech = @{
    To         = 'a@a.com'
    SmtpServer = 'relayserver'
    From       = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)
Get-ChildItem -Path c:\perflogs\logs | ForEach-Object {
    $Files = (Get-ChildItem -Path c:\perflogs\logs\*.csv | Where-Object {$_.LastWriteTime -gt $CompareDate} | Measure-Object).Count
    if ($Files -eq 0)
    {
        $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
        $EmailTech.Body    = 'A performance monitor log file was not found on ServerA for today'
        Send-MailMessage @EmailTech
    }
    else
    {
        # A file for today was found, so send an OK email
        $EmailTech.Subject = 'Perfmon File ServerA - OK'
        $EmailTech.Body    = 'A performance monitor log file was found on ServerA for today'
        Send-MailMessage @EmailTech
    }
}
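One way to get a single email per run (a minimal sketch, reusing the same $EmailTech splat and the c:\perflogs\logs path) is to drop the outer loop, count today's files once, and send exactly one message based on that count. It compares against midnight today rather than 24 hours ago, so "modified today" is literal:
$EmailTech = @{
    To         = 'a@a.com'
    SmtpServer = 'relayserver'
    From       = 'a@a.com'
}
# Midnight today, so only files modified today count
$Today = (Get-Date).Date
# Count the matching files once, outside of any per-file loop
$TodaysFiles = @(Get-ChildItem -Path c:\perflogs\logs\*.csv |
    Where-Object { $_.LastWriteTime -ge $Today }).Count

if ($TodaysFiles -eq 0) {
    $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
    $EmailTech.Body    = 'A performance monitor log file was not found on ServerA for today'
}
else {
    $EmailTech.Subject = 'Perfmon File ServerA - OK'
    $EmailTech.Body    = 'A performance monitor log file was found on ServerA for today'
}
# One call, one email, no matter how many files are in the folder
Send-MailMessage @EmailTech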
I'm trying to fetch some registry parameters from about 4'000 machines with PowerShell. There is no chance that all of the machines will be up at the same time, so I'd like to save a record of the machines that have already been backed up, so the script only gets the parameters from machines that haven't been done yet.
I made a CSV file in the form Machine,Status, where the Machine column stores machine names and Status is meant to be 0 if the script hasn't run on that machine yet and 1 if the machine has already been backed up.
I'm already successfully parsing the machine names from the CSV where Status = 0 and running my backup script against all machines that are up at run time, but I can't figure out how to write Status = 1 back to the CSV (or, from what I understood after some reading, to a temporary CSV that later replaces the original one).
My code looks like this:
$csv = Import-Csv -Path ./list.csv
foreach ($line in $csv) {
    $status = $line.Status
    $machine = $line.Machine
    if ($status -eq "0") {
        [REG BACKUP SCRIPT]
        $status = 1 #Set the machine as done, but how to put it back to the CSV ?
        Export-Csv -Path ./list-new.csv
    }
}
I don't know where to put Export-Csv; my attempts so far haven't worked.
Thanks
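A minimal sketch of one way to write the status back (assuming the same Machine,Status layout in list.csv): change the Status property on each row object inside the loop, then export the whole collection once, after the loop, and swap the files.
$csv = Import-Csv -Path ./list.csv
foreach ($line in $csv) {
    if ($line.Status -eq "0") {
        # [REG BACKUP SCRIPT] for $line.Machine would run here (placeholder)
        # Update the property on the row object itself, not a copied variable
        $line.Status = "1"
    }
}
# Export the whole (modified) collection once, then replace the original file
$csv | Export-Csv -Path ./list-new.csv -NoTypeInformation
Move-Item -Path ./list-new.csv -Destination ./list.csv -Force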
#robocopy backup scripts on each server copy their log files to one of two locations: \\backup_server\backup\logs\success if the backup is
#successful, or \\backup_server\backup\logs\fail if the backup fails. This script looks at those two locations to check for failures or
#missing log files, then reports them to the helpdesk.
Function test-servers {
    param ($serverlist)
    foreach ($server in $serverlist) {
        if (Test-Path \\backup_server\backup\logs\success\$server) {
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $server -Force
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $success -Force
        }
        elseif (Test-Path \\backup_server\backup\logs\fail\$server) {
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $server -Force
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $failure -Force
        }
        else {
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $server -Force
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $missing -Force
        }
    }
}
$date = Get-Date
$loglocation = "The log file archive can be found in \\backup_server\backup\logs\archive\"
$filename = (Get-Date).ToString("MM-dd-yyyy")
$success = "Error log found, backup was successful."
$failure = "Error log found, backup was completed with errors."
$missing = "Error log not found, check server."
[System.Collections.ArrayList]$Serverlist = Get-Content \\backup_server\backup\logs\serverloglist.txt
Remove-Item \\backup_server\backup\logs\errorlog.txt
New-Item -Path \\backup_server\backup\logs\ -Name "errorlog.txt" -ItemType "file"
Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $date -Force
test-servers -serverlist $serverlist
Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $loglocation -Force
#email out log file and clean up file system
Send-MailMessage -From "no-reply@example.com" -To "helpdesk@example.com" -SmtpServer "mail.example.com" -Port 25 -Subject "Nightly Robocopy Log Files" -Attachments "\\backup_server\backup\logs\errorlog.txt" -Body "Nightly server backup completed, see attachment for details"
exit
Background: I work for a small company IT department and we're managing around two dozen generic file servers which perform simple robocopy batch backups every night. No problems there. In an effort to limit the number of "Server_1 has completed its nightly backup" tickets to our helpdesk, I've written a quick-and-dirty script that we use to confirm the completion of our nightly backups. All of the log files from our nightly backups are copied to a "success" or "fail" folder, and this script checks each location, notes which files are in which folder, and emails a single ticket. Then the backup logs that failed are copied to an archive folder for the current date, and a copy of the primary log file is copied along with them. This system works fantastically; the attached log file has all of the proper information about which server failed or succeeded.
So here is my dilemma. The log file that is attached to the send-mailmessage email isn’t the same log file as the source file! The log file left over in the \\backup_server\backup\logs\ directory has the current date and time (although a few seconds after the log file sent via email) on line 1 as does the copy sent by send-mailmessage, but says that every server’s nightly log is missing (see image). So somewhere along the line, 2 log files are being created, with the exact same file name and file path, the correct log first, and the incorrect log seconds later. The only way I can see this happening is if there’s a problem with my function. But it’s only called once, so I don’t see how it could write two different files. The script is called by the Task Scheduler on the server, and is running only once each day according to the task history, reinforced by the fact that we only receive one email from the script each day.
For the sake of brevity, I’ve removed the end of the script where the old logs are copied and cleaned up, which consists of a few lines of copy- and remove-item. I’ve gone over the man pages for each cmdlet in the script and delved into most every post involving send-mailmessage and use of functions on SE.
How can a script that only calls the new-item cmdlet once to create a single file instead create two separate instances of the file? Is it an issue with my function, or does send-mailmessage have unknown-to-me effects on its attachments?
As this is my first post on the SE network, I understand it may not conform to all of the rules. I’ve done my best to go over the FAQs and keep the post on topic. If the question can be improved in any way, your advice is welcome.
Log file example: on the left is the file we receive as the email attachment; on the right is the file left at \\backup_server\backup\logs\errorlog.txt
Edit: I updated the script, removing the function at the beginning and simply placing the foreach loop where the function was being called to confirm it was not an issue with the function call within the script.
I adjusted the script a bit, and the task ran in the morning as scheduled; the issue persists. But after running the script manually for a few tests, both the log in the archive and the emailed log were the same! So it appears the error was not with the script itself, but with the way I am calling it in the Task Scheduler. As scheduling a task to run a PowerShell script is a completely different question, I'll consider this one resolved. I'll leave the question up for a week and then delete it, in case any of the 3 people who have seen it are interested in the solution.
Edit: So it appears it was entirely user error! And the worst kind of user error. I had the task set up on 2 different servers, one running a slightly older version of the script, running on the same schedule. So the tasks ran, one overwrote the logs from the other, and I was left with the aforementioned headache. Deleted the second task and now everything works properly! Hooray for stupid mistakes.
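For what it's worth, a small guard that would have surfaced this sooner (a sketch, not part of the original script): stamp the log with the machine and script path that wrote it, so two overlapping scheduled tasks can't masquerade as one.
# Record which machine and which copy of the script wrote this log,
# so overlapping scheduled tasks are immediately visible in the output.
$logfile = "\\backup_server\backup\logs\errorlog.txt"
$stamp = "Written by $env:COMPUTERNAME running $PSCommandPath at $(Get-Date)"
Add-Content -Path $logfile -Value $stamp -Force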
Quick need: Within a SQL Agent job step, I am looking for a way to copy files newer than 60 minutes "ago" to another server. I don't want to re-copy any files older than that. So, copy, xcopy, robocopy are all possibilities as this is a Windows 2008 or higher server.
Background: I'm wiring up a process where serverA has a folder into which an ERP application dumps flat text files. I need to copy the newest files once per hour to serverB so that another application (an SSIS package that kicks off every 60 minutes) can process each file and save the data into SQL Server. In order to only copy new files that appear, and not re-copy anything I've already copied over (an "if exists" check won't work because I remove the copied file after SSIS processes it), I need to copy only files that are 60 minutes old or newer and exclude all other files.
For what it's worth, the method used will be a SQL Agent job step, so CmdExec and PowerShell are both allowed (I am new to PowerShell, so I was leaning toward robocopy).
My solution (and my first PowerShell script) was to use Get-ChildItem piped to the Where-Object cmdlet to select only the files that were 60 minutes old or newer, and then use Copy-Item within a foreach, as below:
$srcPath = '\\serverA\ERP\Outbox\'
$destPath = 'C:\ERP\FromERP\Inbox\'
# target files where LastWriteTime >= 60 minutes "ago"
$age = (Get-Date).AddMinutes(-60)
#Write-Output "$age"
$newFiles = Get-ChildItem $srcPath | Where-Object { $_.LastWriteTime -ge $age }
#Write-Output "$newFiles"
foreach ($newFile in $newFiles) {
    #Write-Output "Copying $newFile to $destPath"
    Copy-Item $newFile.FullName -Destination "$($destPath)$($newFile)"
}
Some of the lines that are commented were simply for debugging purposes but I left them in to help in understanding what is going on.
Note: I experimented with copy, xcopy and robocopy, and I think that with some crafty batch syntax I saw elsewhere on SO I could have gotten them to work (robocopy has a MAXAGE argument, but the lowest value is 1 day, which is not granular enough for minutes), but PowerShell really felt simpler and more elegant.
Hopefully someone else can make use of the technique.
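As a small variation on the same technique (a sketch under the same assumptions about the source and destination paths), the listing can be restricted to files only and Join-Path used to build each destination path:
$srcPath  = '\\serverA\ERP\Outbox\'
$destPath = 'C:\ERP\FromERP\Inbox\'
$age      = (Get-Date).AddMinutes(-60)

# -File skips subfolders; Join-Path builds the destination path without string concatenation
Get-ChildItem -Path $srcPath -File |
    Where-Object { $_.LastWriteTime -ge $age } |
    ForEach-Object {
        Copy-Item -Path $_.FullName -Destination (Join-Path $destPath $_.Name)
    }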
Each week we have a backup file replicate over to a folder on our file server. I'm looking for a way to notify me if a file has not been written to that folder in over 7 days.
I found this script online (I apologize to the author for not being able to credit them), and I feel like it puts me on the right track. What I'm really looking for, though, is some kind of output that will tell me if a file hasn't been written at all. I don't need confirmation when the backup is successful.
$lastWrite = (Get-Item C:\ExampleDirectory).LastWriteTime
$timespan = New-TimeSpan -Days 7
if (((Get-Date) - $lastWrite) -gt $timespan) {
    # older
} else {
    # newer
}
What you'll want to do is grab all files in the directory, sort by LastWriteTime, and then compare that of the newest file to 7 days ago:
$LastWriteTime = (Get-ChildItem C:\ExampleDirectory | Sort-Object LastWriteTime)[-1].LastWriteTime
if ($LastWriteTime -gt [DateTime]::Now.AddDays(-7))
{
    # File newer than 7 days is present
}
else
{
    # Something is wrong, time to alert!
}
For the alerting part, check out Send-MailMessage or Write-EventLog
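A minimal sketch of how the check and the alert could be combined (the folder, addresses and SMTP server here are placeholders):
# Newest write time in the folder; alert only when nothing recent is found
$LastWriteTime = (Get-ChildItem C:\ExampleDirectory | Sort-Object LastWriteTime)[-1].LastWriteTime

if ($LastWriteTime -le [DateTime]::Now.AddDays(-7))
{
    # No file written in the last 7 days - send the alert
    $Alert = @{
        To         = 'me@example.com'
        From       = 'backupcheck@example.com'
        SmtpServer = 'smtp.example.com'
        Subject    = 'Backup file missing'
        Body       = 'No file has been written to C:\ExampleDirectory in over 7 days.'
    }
    Send-MailMessage @Alert
}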
I've created a script, based on ones I found in other forums, to copy a file from one server to another. A bit of background here. We have large PDF files being generated nightly and saved on a share at one site that need to be copied over to our corporate share. These files are pretty sizable (anywhere between 25MB and as high as 65MB), so I can only copy them off hours. The source repository holds all the original files from the year, so I only want to copy the most recent files. I created a script (or tried to, at least) that copies only the most recent files from the SourceHost location to the CorpHost share, and set up a Task Scheduler job to run at 7:30pm.
The script kicks off and runs as scheduled, but nothing gets copied over. I don't see any errors being generated from the scheduled task, and it appears to run as normal as the script returns a "not copying .pdf". Originally, I thought that maybe it was bypassing all the files because the generation date is outside the $Max_days range (-1), so I increased it to -2. No luck. Increased it again to -5 - no luck. -10... nothing.
Here's the code sample:
$RemotePath = "\\<CorpHost>\Shared\Packing_Slips\<Site>"
$SourcePath = "\\<SourceHost>\<Site>_packingslips"
$Max_days = "-1"
$Curr_date = Get-Date
#Checking date and then copying file from SourcePath to RemotePath
foreach ($file in (Get-ChildItem $SourcePath))
{
    if ($file.LastWriteTime -gt ($Curr_date).AddDays($Max_days))
    {
        Copy-Item -Path $file.FullName -Destination $RemotePath
    }
    else
    {
        "not copying $file"
    }
}
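One way to see what the scheduled task actually evaluates at 7:30pm (a sketch reusing $SourcePath, $RemotePath and $Max_days from the script above; the log path is a placeholder) is to write each comparison to a log file, since plain console output is lost when the Task Scheduler runs the script:
$LogFile = "C:\ERP_copy_debug.txt"   # placeholder path for the debug log
$cutoff  = (Get-Date).AddDays($Max_days)

Add-Content -Path $LogFile -Value "Run started $(Get-Date), cutoff $cutoff"
foreach ($file in (Get-ChildItem $SourcePath))
{
    if ($file.LastWriteTime -gt $cutoff)
    {
        Copy-Item -Path $file.FullName -Destination $RemotePath
        Add-Content -Path $LogFile -Value "Copied  $($file.Name) (LastWriteTime $($file.LastWriteTime))"
    }
    else
    {
        Add-Content -Path $LogFile -Value "Skipped $($file.Name) (LastWriteTime $($file.LastWriteTime))"
    }
}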