I have a Windows server with some hotfolders; all data in those folders gets transferred to another directory. But sometimes files get stuck in the folder even though the service is still running.
Because of that I wanted to create a PowerShell script that watches the file creation time and compares it with the current time. If the difference is more than 10 minutes, it should send me a mail so I can restart the service.
The script should loop if it detects a file that isn't transferred, and send only one mail, not a mail for every file.
I tried it myself and got stuck at this point:
Get-ChildItem -Force C:\Ueberwachung
Where-Object {($_.LastWriteTime -le $CurDate.AddMinutes(-10))}
$PSEmailServer = "192.168.0.11"
ForEach ($file in $Files) {
    Send-MailMessage -to "luis.jablonski@boyn.eu" -from "PowerShell <ps@boyn.eu>" -Subject "Hotfolder Alarm" -body "Files are not being processed"
    break
}
The script detects all files in the folder, see:
Directory: C:\Ueberwachung
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 30.04.2021 09:39 0 asdsad.pptx
-a---- 30.04.2021 09:39 0 sadsada.docx
-a---- 30.04.2021 09:39 0 sadsadsadsad.txt
-a---- 28.04.2021 16:03 0 test.txt
but it doesn't send me a mail, so I think something like my trigger or the comparison with the current time doesn't work.
Does anybody have an idea what my problem is?
I'm glad for every help
Dear Luis
Your script is behaving as I would expect.
Get-ChildItem's output goes to the screen because it is never captured into a list ($Files).
$Files is empty, so the loop is never 'looped'.
Issues here:
In the line ForEach ($file in $Files) { Send-MailMessage ... } you have not added anything to $Files. You could pipe Get-ChildItem -Force C:\Ueberwachung | Where-Object { $_.LastWriteTime -le $CurDate.AddMinutes(-10) } into $Files to give the list some data. Otherwise $Files is empty, so there is nothing to loop through and therefore no emails.
Send-MailMessage should be given the parameter -SmtpServer $PSEmailServer so it is explicit which SMTP server sends the emails ($PSEmailServer does act as the default, but only if it is set before the call).
Last but not least, do you really want an email per file in $Files? Perhaps you want to change the code so that the whole list of $Files goes into the email body instead?
Not quite sure why break is in there?
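Putting those pieces together, a minimal sketch of how the corrected script could look (assuming 192.168.0.11 is a reachable relay; the addresses are the ones from your post):

# Collect the stale files into $Files instead of letting them print to the screen
$CurDate = Get-Date
$Files = Get-ChildItem -Force C:\Ueberwachung |
    Where-Object { $_.LastWriteTime -le $CurDate.AddMinutes(-10) }

# Send a single mail listing every stuck file, rather than one mail per file
if ($Files) {
    Send-MailMessage -SmtpServer '192.168.0.11' `
        -To 'luis.jablonski@boyn.eu' `
        -From 'PowerShell <ps@boyn.eu>' `
        -Subject 'Hotfolder Alarm' `
        -Body "Files are not being processed:`r`n$($Files.Name -join "`r`n")"
}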
#robocopy backup scripts on each server copy their log files to one of two locations:
#\\backup_server\backup\logs\success if the backup is successful, or
#\\backup_server\backup\logs\fail if the backup fails. This script looks at those two
#locations to check for failures or missing log files, then reports them to the helpdesk.
Function Test-Servers {
    param ($ServerList)
    foreach ($server in $ServerList) {
        if (Test-Path \\backup_server\backup\logs\success\$server) {
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $server -Force
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $success -Force
        }
        elseif (Test-Path \\backup_server\backup\logs\fail\$server) {
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $server -Force
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $failure -Force
        }
        else {
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $server -Force
            Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $missing -Force
        }
    }
}
$date = Get-Date
$loglocation = "The log file archive can be found in \\backup_server\backup\logs\archive\"
$filename = (Get-Date).ToString("MM-dd-yyyy")
$success = "Error log found, backup was successful."
$failure = "Error log found, backup was completed with errors."
$missing = "Error log not found, check server."
[System.Collections.ArrayList]$ServerList = Get-Content \\backup_server\backup\logs\serverloglist.txt
Remove-Item \\backup_server\backup\logs\errorlog.txt
New-Item -Path \\backup_server\backup\logs\ -Name "errorlog.txt" -ItemType "file"
Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $date -Force
Test-Servers -ServerList $ServerList
Add-Content -Path \\backup_server\backup\logs\errorlog.txt -Value $loglocation -Force
#email out log file and clean up file system
Send-MailMessage -From "no-reply@example.com" -To "helpdesk@example.com" -SmtpServer "mail.example.com" -Port 25 -Subject "Nightly Robocopy Log Files" -Attachments "\\backup_server\backup\logs\errorlog.txt" -Body "Nightly server backup completed, see attachment for details"
exit
Background: I work for a small company IT department and we're managing around two dozen generic file servers which perform simple robocopy batch backups every night. No problems there. In an effort to limit the number of "Server_1 has completed its nightly backup" tickets to our helpdesk, I've written a quick-and-dirty script that we use to confirm the completion of our nightly backups. All of the log files from our nightly backups are copied to a "success" or "fail" folder, and this script checks each location, notes which files are in which folder, and emails a single ticket. Then the backup logs that failed are copied to an archive folder for the current date, along with a copy of the primary log file. This system works fantastically; the attached log file has all of the proper information about which server failed or succeeded.
So here is my dilemma. The log file that is attached to the send-mailmessage email isn’t the same log file as the source file! The log file left over in the \\backup_server\backup\logs\ directory has the current date and time (although a few seconds after the log file sent via email) on line 1 as does the copy sent by send-mailmessage, but says that every server’s nightly log is missing (see image). So somewhere along the line, 2 log files are being created, with the exact same file name and file path, the correct log first, and the incorrect log seconds later. The only way I can see this happening is if there’s a problem with my function. But it’s only called once, so I don’t see how it could write two different files. The script is called by the Task Scheduler on the server, and is running only once each day according to the task history, reinforced by the fact that we only receive one email from the script each day.
For the sake of brevity, I’ve removed the end of the script where the old logs are copied and cleaned up, which consists of a few lines of copy- and remove-item. I’ve gone over the man pages for each cmdlet in the script and delved into most every post involving send-mailmessage and use of functions on SE.
How can a script that only calls the new-item cmdlet once to create a single file instead create two separate instances of the file? Is it an issue with my function, or does send-mailmessage have unknown-to-me effects on its attachments?
As this is my first post on the SE network, I understand it may not conform to all of the rules. I’ve done my best to go over the FAQs and keep the post on topic. If the question can be improved in any way, your advice is welcome.
Log File Example: on the left is the file we receive as the email attachment; on the right is the file left in \\backup_server\backup\logs\errorlog.txt
Edit: I updated the script, removing the function at the beginning and simply placing the foreach loop where the function was being called to confirm it was not an issue with the function call within the script.
I adjusted the script a bit, and the task ran in the morning as scheduled, issue persists. But after running the script manually for a few tests, both the log in the archive and the emailed log were the same! So it appears the error was not with the script itself, but with the way I am calling it in the task scheduler. As scheduling a task to run a powershell script is a completely different question, I'll consider this one resolved. I'll leave the question up for a week and then delete, in case any of the 3 people who have seen it are interested in the solution.
Edit: So it appears it was entirely user error! And the worst kind of user error. I had the task set up on 2 different servers, one running a slightly older version of the script, running on the same schedule. So the tasks ran, one overwrote the logs from the other, and I was left with the aforementioned headache. Deleted the second task and now everything works properly! Hooray for stupid mistakes.
I'm a blind user and I keep having to go find what's breaking a few people's computers and it's getting annoying. They always click on "install this Active-X" or "download the free video player now" and I have to then dig through everything.
I whipped up a PowerShell script to search C:\ for files with a write time within the last 5 minutes (for testing purposes); the Get-ChildItem part works. Now I just want to get a list of file paths to make my life easier, but I am missing something.
Here's what I have so far:
cd C:\
$fileizer = Get-ChildItem -Path . -Exclude *.txt,*.log -ErrorAction SilentlyContinue -Recurse |
    Where-Object { $_.LastWriteTime -gt (Get-Date).AddMinutes(-5) }
echo $fileizer
Here are the results if I just do the Get-ChildItem part of it:
PS C:\Users\tryso> c:\bin\hours.ps1
Directory: C:\bin
Mode LastWriteTime Length Name
---- ------------- ------ ----
-a---- 8/1/2017 2:44 PM 169 hours.ps1
PS C:\>
Obviously I am going to narrow down the path to something more specific than just C:\ like to get into C:\Windows\Temp and C:\Users\ and the likes, I'm just wondering how to parse everything to just give me a list of files and their path.
I'd also like to point out that 5 minutes old is dumb, yes I know. I just did that to make it scream through my C:\ drive, because you'd be amazed at how many files in C:\ have a write time within the last half hour, LoL.
Ultimately I'd like to figure out how to find new files as opposed to recent write times if that's possible.
Sorry if my query is lame or a repeat; the only close examples I have found don't work for me for some reason, and I'm pretty new at PS scripting - but it's getting pretty addictive and awesome LoL.
Thanks a million for any help!
Ryan
The Select-Object cmdlet can pull out the information you're looking for. Often you will want to know more than one piece of info about your results, so plain dot notation (e.g. $fileizer.FullName) isn't going to be the most flexible.
Try something like this to see the full path, size and last modified timestamp:
Get-ChildItem -Path $path -Exclude *.txt,*.log -ErrorAction SilentlyContinue -Recurse | Where-Object { $_.LastWriteTime -gt (Get-Date).AddMinutes(-5) } | Select-Object FullName, Length, LastWriteTime
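For the follow-up about finding genuinely new files rather than recently modified ones, a quick sketch: filter on the CreationTime property instead of LastWriteTime, and everything else can stay the same.

# Files created (not merely modified) in the last 5 minutes
Get-ChildItem -Path $path -Exclude *.txt,*.log -ErrorAction SilentlyContinue -Recurse |
    Where-Object { $_.CreationTime -gt (Get-Date).AddMinutes(-5) } |
    Select-Object FullName, Length, CreationTime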
I'm not a PowerShell guru, but does anyone have a script that counts the files in a folder and automatically sends a mail to the user? Our users have roaming profiles
(\\profile-srv\%username%)
The folder name is the same as the username. Is it possible to have a script that counts the files in every home folder and emails each user?
The domain is FirmaBis.org; total users: 150.
So count, for example, aaba and send mail to aaba@firmabis.org.
Count next aaca and send mail to aaca@firmabis.org.
So the script will count files and send mail to the user based on the folder name plus firmabis.org.
Thanks!
# Get just the directories in the user directory share, ignore any files, loop over them
Get-ChildItem -Path '\\server\share' -Directory | ForEach-Object {
    # List all the files in the current folder (loop variable $_ is the folder)
    $FilesInFolder = @($_ | Get-ChildItem -Recurse -Force -File)
    # Count the files
    $NumFiles = $FilesInFolder.Count
    # Sum how many bytes they take up and show it in MB to 2 decimal places.
    # Note: $FilesInFolder.Length would give the array length, so sum the
    # Length property through Measure-Object instead.
    $FileSize = ($FilesInFolder | Measure-Object -Property Length -Sum).Sum
    $FileSize = "{0:N2} MB" -f ($FileSize / 1MB)
    # Build the email message
    $Message = @"
Hi,
The folder for account ($($_.Name)) has $NumFiles files in it.
They add up to $FileSize
"@
    # Send the message through an SMTP server which is configured to allow
    # unauthenticated relay emails from the computer running the script.
    Send-MailMessage -SmtpServer yourmailserver -To "$($_.Name)@FirmaBis.org" -From 'script@FirmaBis.org' -Body $Message
}
Untested, but ...
I have not seen anything that you have tried so far, but just to give you a start:
You can get the file count using a combination of Get-ChildItem and Measure-Object:
( Get-ChildItem D:\FolderName | Measure-Object ).Count
You can store that output in a variable.
Then you can pass the variable as the -Body of Send-MailMessage, which sends the email; a sketch follows.
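A minimal sketch of that flow (the SMTP server and both addresses are placeholders):

# Count the files in the folder
$Count = ( Get-ChildItem D:\FolderName | Measure-Object ).Count

# Pass the count in the body; server and addresses below are placeholders
Send-MailMessage -SmtpServer 'mail.example.com' `
    -To 'user@firmabis.org' `
    -From 'script@firmabis.org' `
    -Subject 'File count' `
    -Body "Your folder contains $Count files."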
I run perfmon on a server and the log files go into a C:\perfmon folder. The perfmon task restarts each week and the log files just collect there over time, so the folder can hold CSV files with a range of different dates.
I would like to write a powershell script that will check that folder to make sure that there is a file there with todays modified date on it. If there isn't one for today I would like a FAILED email so that I know to look at perfmon for issues.
Has anyone written anything like this. I have tried several of the scripts in here and none do it exactly how I would like it to work. This is what I have so far based on other scripts.
It sort of works, but it is checking all files and responding for each file as well. If I had 3 files over the last three days and none for today, I would get 3 emails saying FAIL. If I have one for today and two older ones, I get 3 OK emails. If I have just one file for today, I get one OK email. How do I restrict this to just one email for a fail or a success? There could be 50-100 files in that folder after two years, and I just want a FAIL if none of them were modified today.
Hope that all makes sense. I'm afraid my Powershell skills are very weak.
$EmailTech = @{
    To = 'a@a.com'
    SmtpServer = 'relayserver'
    From = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)
Get-ChildItem -Path c:\perflogs\logs | ForEach-Object {
    $Files = (Get-ChildItem -Path c:\perflogs\logs\*.csv | Where-Object {$_.LastWriteTime -gt $CompareDate} | Measure-Object).Count
    if ($Files -eq 0)
    {
        $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
        $EmailTech.Body = 'A performance monitor log file was not found on ServerA for today'
        Send-MailMessage @EmailTech
    }
    else
    {
        # If we found files it's ok and we don't report it
        $EmailTech.Subject = 'Perfmon File ServerA - OK'
        $EmailTech.Body = 'A performance monitor log file was found on ServerA for today'
        Send-MailMessage @EmailTech
    }
}
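One way to get a single mail (a minimal sketch, keeping the paths and addresses from the post) is to drop the outer ForEach-Object so the count, and therefore the email, runs exactly once:

$EmailTech = @{
    To         = 'a@a.com'
    SmtpServer = 'relayserver'
    From       = 'a@a.com'
}
$CompareDate = (Get-Date).AddDays(-1)

# One count over the whole folder, then exactly one email either way
$Files = (Get-ChildItem -Path c:\perflogs\logs\*.csv |
    Where-Object { $_.LastWriteTime -gt $CompareDate } |
    Measure-Object).Count

if ($Files -eq 0) {
    $EmailTech.Subject = 'Perfmon File ServerA - FAIL'
    $EmailTech.Body    = 'A performance monitor log file was not found on ServerA for today'
}
else {
    $EmailTech.Subject = 'Perfmon File ServerA - OK'
    $EmailTech.Body    = 'A performance monitor log file was found on ServerA for today'
}
Send-MailMessage @EmailTech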
So I asked for, and got, some help with this script:
#Command to get list of folders with logfiles where the logfile is at least 30 minutes old send results to variable.
$varlogfile = Get-ChildItem -Path "drive:\folder" -Recurse -Include "logfile" | Where-Object {$_.LastWriteTime -le ((Get-Date).AddMinutes(-30))}
#Add a carriage return to results contained in the variable so email is easier to read
$varlogfile = $varlogfile -join "`r`n"
#Email setup from this line down to next comment
$SMTPServer = "email.server"
$From = "Administrator <administrator#place.com>"
$To = "email","email2"
$Subject = "A Logfile older than 30 minutes has been detected"
$Body = "Logfile(s) older than 30 minutes have been detected in the following folder(s):
$varlogfile
Please login and attempt to process the files manually, if the manual process fails, open a ticket with someone.
From the Admin
"
#Email setup above this line
#If statement that looks for the text blah in the variable, if found email is sent.
if($varlogfile -match "blah")
{
#Command to send email
Send-MailMessage -From $From -to $To -Subject $Subject -Body $Body -SmtpServer $SMTPServer
}
exit 0;
And all that is working perfectly.
Here's the thing though. Over the weekend sometimes we may get a stuck logfile that can't be resolved until Monday morning and it would be nice to be able to turn off alerts when this happens.
Now I'm very new to PowerShell and this script has been my learning experience. The way I see it, I have 3 choices:
1. Keep Get-ChildItem from returning a result if it sees both logfile and logfile.stop.
2. After Get-ChildItem has produced $varlogfile, search $varlogfile for logfile.stop and delete the lines for logfile and logfile.stop from it.
3. Rewrite the whole thing from scratch and produce $varlogfile in a better way that makes the results easier to work with.
Thoughts and opinions? I'm leaning toward method 2, as I think I can figure that out, but I'm curious whether that is a way of pain. I'd really like your input on this.
Thanks people!
I think you're on the right path with your current plan, so I'll help you with a twist on approach #2: creating a .sent flag file whenever we send an email, to keep the emails from being sent multiple times.
Our first step: when an e-mail is sent, we create a new file titled $logfile.MessageSent or something like that. Doing this lets the e-mail go out while also leaving a flag in the filesystem that we can search for later to determine whether or not to send another e-mail.
#If statement that looks for the text blah in the variable, if found email is sent.
if ($varlogfile -match "blah")
{
    #Command to send email
    Send-MailMessage -From $From -to $To -Subject $Subject -Body $Body -SmtpServer $SMTPServer
    #Create a .sent flag next to each logfile we just alerted on
    foreach ($log in $varlogfile -split "`r`n")
    {
        New-Item -Path "$log.sent" -ItemType File
    }
}
Our second step: Modify our Get-ChildItem query to search for the flag:
$sentFlags = (Get-ChildItem -Path "drive:\folder" -Recurse -Include "*.sent").Name
$varlogfile = Get-ChildItem -Path "drive:\folder" -Recurse -Include "logfile" |
    Where-Object {$_.LastWriteTime -le ((Get-Date).AddMinutes(-30))} |
    Where-Object {"$($_.Name).sent" -notin $sentFlags}
This second modification to the $varlogfile step is hard to understand, admittedly. Here is how I've changed it:
Get a list of files in the drive:\folder path, recursively, including logfile
Where the LastWriteTime is older than 30 minutes
Where filename.sent is not found among the .sent flag files we collected first
The only other thing you'll need to do is add a cleanup task to regularly delete the .sent files, and you're good to go.
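For that cleanup, something along these lines could run on its own schedule (the path mirrors the one above, and the rule of deleting a flag once its logfile is gone is my assumption):

# Hypothetical cleanup: delete each .sent flag whose logfile no longer exists
Get-ChildItem -Path "drive:\folder" -Recurse -Include "*.sent" |
    Where-Object { -not (Test-Path ($_.FullName -replace '\.sent$', '')) } |
    Remove-Item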
Please let me know if you have any questions about this approach, as I want to be sure you understand and to help you learn.