Send mail based on time that previous mail was sent - PowerShell

I have a script that checks if a site is online and sends a mail if it's down.
The script is configured with a scheduled task that runs every 30 minutes.
The problem is the following:
If the site is down during the weekend or in the evening (or on a day when I'm not monitoring the mailbox), the mails keep being sent.
I was wondering what method I could use to only send a mail if the last one was sent more than 3 hours ago.
That way, I would send a mail only once every 3 hours.
I have researched the use of registry keys, but was wondering whether this would be the correct approach.

Rather than the registry, I'd use a simple configuration file stored in ProgramData (or AppData, should you need a per-user configuration).
This makes the process of loading/saving parameters and adding new ones very easy.
Also, should you need to save logs and / or other data, you can just put them inside that same folder.
$ConfigFullPath = "$env:APPDATA\My Monitoring solution\config.json"

# This creates the config file if none is present.
if (-not (Test-Path $ConfigFullPath)) {
    New-Item -ItemType File -Path $ConfigFullPath -Value ([PSCustomObject]@{'LastEmailSent' = [datetime]::MinValue} | ConvertTo-Json) -Force
}

$ConfigFileParams = ConvertFrom-Json -InputObject (Get-Content $ConfigFullPath -Raw)
$SendEmail = ([DateTime]::UtcNow - [DateTime]$ConfigFileParams.LastEmailSent).TotalHours -ge 3

if ($SendEmail) {
    try {
        # Send-MailMessage -ErrorAction Stop
        # Once the email is sent, we update the config
        $ConfigFileParams.LastEmailSent = [DateTime]::UtcNow
        $ConfigFileParams | ConvertTo-Json | Out-File $ConfigFullPath
    }
    catch {
        # Manage what to do in case of failure
    }
}
That being said, you can definitely use the registry to do the same.
For convenience and ease of use, though, I strongly suggest the simpler JSON-file-based approach.
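If you do want to go the registry route, here's a minimal sketch of the same idea; the key path and value name below are assumptions, so adjust them to your environment:
# Assumed key and value name used to remember when the last alert went out.
$RegPath = 'HKCU:\Software\My Monitoring solution'
if (-not (Test-Path $RegPath)) { New-Item -Path $RegPath -Force | Out-Null }

$lastSent = (Get-ItemProperty -Path $RegPath -Name LastEmailSent -ErrorAction SilentlyContinue).LastEmailSent
if (-not $lastSent) { $lastSent = [DateTime]::MinValue.ToString('o') }

if (([DateTime]::UtcNow - [DateTime]$lastSent).TotalHours -ge 3) {
    # Send-MailMessage -ErrorAction Stop
    Set-ItemProperty -Path $RegPath -Name LastEmailSent -Value ([DateTime]::UtcNow.ToString('o'))
}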

I think the best option would be to write the hour of the first alert to a file saved on disk, and every time the script runs, test whether
(currentHour - hourFirstSent) % 3 == 0 && currentMinute < 30
Name the file yyyy-mm-dd; if that file exists, read the starting hour from it, otherwise create it and save the starting hour in it.
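A minimal sketch of that idea in PowerShell might look like this; the marker-file folder and name are assumptions:
# Assumed folder and file name for the per-day marker file.
$marker = Join-Path $env:ProgramData ("SiteMonitor\{0:yyyy-MM-dd}.txt" -f (Get-Date))
New-Item -ItemType Directory -Path (Split-Path $marker) -Force | Out-Null

$now = Get-Date
if (Test-Path $marker) {
    $hourFirstSent = [int](Get-Content $marker)
} else {
    $hourFirstSent = $now.Hour
    Set-Content -Path $marker -Value $hourFirstSent
}

# Only alert in the first half-hour of every third hour after the first alert.
if ((($now.Hour - $hourFirstSent) % 3 -eq 0) -and ($now.Minute -lt 30)) {
    # Send-MailMessage ...
}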

Related

SFTP upload of files that were created since the last run

I am very new to PowerShell and I am in the process of writing a script that performs an SFTP file transfer via WinSCP. I will then be creating a Task in Windows Task Scheduler to run this script every 15 minutes indefinitely. Currently I have this line of code that gets all files in a directory whose last write time is more than 20 seconds in the past:
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddSeconds(-20) }
I have been told that this needs to be changed so that it gets all files since the last time the Task was run (15 minutes prior) instead, but I have had very little luck in finding the answer.
I have tried using Get-ScheduledTask but that only seems to get me basic information about the task and doesn't seem like it is what I need for my script. Also, I have already downloaded the WinSCP .dll file and unblocked it in PowerShell. Any help is welcome, TIA.
Using the time the task last ran is, imo, not reliable. There's still room for you to miss some files or to transfer some files repeatedly.
Instead, consider remembering the timestamp of the most recent uploaded file.
Assuming you use Session.PutFiles, you can use code like this:
$transferResult =
    $session.PutFiles($sourcePath, $destPath, $False, $transferOptions)

$transferResult.Check()

# Find the latest uploaded file
$latestTransfer =
    $transferResult.Transfers |
        Sort-Object -Property @{ Expression = { (Get-Item $_.Source).LastWriteTime } } `
            -Descending |
        Select-Object -First 1
And save the $latestTransfer to a file for the next run. Or loop the code with 15 minutes delay, instead of scheduling it every 15 minutes.
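For example, a minimal sketch of persisting that timestamp between runs, with an assumed state-file path:
# Assumed location for the state file that remembers the last uploaded timestamp.
$stateFile = 'C:\Users\lsarm\IHS\lastupload.txt'

# Load the previous timestamp (or start from the beginning of time on the first run).
$lastTimestamp =
    if (Test-Path $stateFile) { [DateTime](Get-Content $stateFile) } else { [DateTime]::MinValue }

# Only pick up files newer than what was uploaded last time.
$filelist = Get-ChildItem C:\Users\lsarm\IHS\test |
    Where-Object { $_.LastWriteTime -gt $lastTimestamp }

# ... upload $filelist with $session.PutFiles and find $latestTransfer as above ...

# After a successful upload, remember the newest timestamp for the next run.
if ($latestTransfer) {
    (Get-Item $latestTransfer.Source).LastWriteTime.ToString('o') | Set-Content $stateFile
}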
Another option is to remember the already transferred files.
Both options are in more details covered in:
How do I transfer new/modified files only?

Scheduling a PowerShell process does not yield the same results as when I run it manually

I wrote a small PowerShell script that I am using to query the Server Log, clean the return values, and use some of the results to perform some server maintenance. However, when I schedule it, the save-to-file piece does not write the whole content to the file and it gets truncated, just like what I am posting below, exactly. As you can observe, the end of the line is truncated with three dots replacing the missing values:
Login failed for user 'sa'. Reason: An error occurred while evaluating the password. [CLIENT: 2...
However, if I run the code manually with Local Admin access, the content gets saved to the local file like this, exactly:
Login failed for user 'sa'. Reason: An error occurred while evaluating the password. [CLIENT: 112.103.198.2]
Why is this the case when I schedule the process or PS file to run on a schedule? BTW, I tried to run it under the SYSTEM context with highest privileges, and even used the same Admin account that I use to run it manually, and I still do not get the full content of the event that I save.
This is creating an issue and I am not able to use the content to process the IP.
Here is the PS code that I am using to query and save the content to file:
$SQL = 'C:\SQL.txt'
Remove-Item $SQL -ErrorAction Ignore
Get-EventLog -LogName Application | Where-Object { $_.EventID -eq 18456 } |
    Select-Object -Property Message | Out-File $SQL
The problem lies with Out-File, because it has a default character limit of 80 per line when it can't inherit a wider console width (as is the case under the Task Scheduler).
You can change it with the -Width parameter and give it a value of, say, 200. Set-Content doesn't have this limit built in, so it might be a more suitable option.
All that being said, I am not sure why it behaves one way when run manually vs. another when the system runs it.
Out-File defaults to Unicode when writing files;
Set-Content defaults to ASCII when writing files.
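A minimal sketch of both workarounds applied to the query above; the width of 400 is just an example, and in the second option I've swapped -Property for -ExpandProperty so a plain string is written instead of a formatted table row:
$SQL = 'C:\SQL.txt'
Remove-Item $SQL -ErrorAction Ignore

# Option 1: keep Out-File but widen the output so the message column is no longer truncated.
Get-EventLog -LogName Application | Where-Object { $_.EventID -eq 18456 } |
    Select-Object -Property Message | Out-File $SQL -Width 400

# Option 2: use Set-Content instead; expand the property so the raw string is written
# rather than a formatted table row.
# Get-EventLog -LogName Application | Where-Object { $_.EventID -eq 18456 } |
#     Select-Object -ExpandProperty Message | Set-Content $SQL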

PST: How not to put a MailboxExportRequest in a queue

I need to export some PSTs. The problem is that when I use my ForEach-Object to export every PST one by one, they are all put in a queue, but another program is supposed to work with the PSTs at the same time.
dir | ForEach-Object {
    $var = $_
    New-MailboxExportRequest -Mailbox $var -FilePath "\\******\******tmp\pst\$var.pst"
}
I don't want my requests to be queued; I want each one to be completed before starting another. For example, if the first request extracts pst1, I want it to be fully extracted before putting pst2 in the queue. Is there a way to do this?
You can't change the queue behavior, but you can force the Exchange server to process only 1 PST at a time.
To achieve this, you need to edit the MSExchangeMailboxReplication.exe.config file located at:
<Exchange Installation Path>\Program Files\Microsoft\Exchange Server\V14\Bin
MaxActiveMovesPerSourceMDB - Default is 5 - Change it to 1
MaxActiveMovesPerTargetMDB - Default is 2 - Change it to 1
You might also need to change those setting as well:
MaxActiveMovesPerTargetServer
MaxActiveMovesPerSourceServer
Of course, if you just want to pause the foreach loop, you can use a while statement (like Oggew suggested) to make sure the previous job has completed before processing the next export.
You could add something like this after the New-MailboxExportRequest (inside the foreach loop). If the export status equals "Queued" or "InProgress", the script will sleep for 15 s and then check again. When the status changes to completed, it will move on to the next New-MailboxExportRequest.
while (Get-MailboxExportRequest -Mailbox $var |
       Where-Object { $_.Status -eq "Queued" -or $_.Status -eq "InProgress" })
{
    Start-Sleep -Seconds 15
}
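Putting the two together, a sketch of the loop from the question could look like this:
dir | ForEach-Object {
    $var = $_
    New-MailboxExportRequest -Mailbox $var -FilePath "\\******\******tmp\pst\$var.pst"

    # Wait until this export has finished before queuing the next one.
    while (Get-MailboxExportRequest -Mailbox $var |
           Where-Object { $_.Status -eq "Queued" -or $_.Status -eq "InProgress" })
    {
        Start-Sleep -Seconds 15
    }
}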

Automatic conversion of evtx to plaintext logs

I have a Windows Server 2008 running Microsoft Exchange. The audit logs are stored in evtx and I am trying to export the logs to a 3rd-party collector. The agents we have used (Snare Epilog, open source among them) do not recognize the evtx format and do not forward them to the collecting server.
I am attempting to implement a workaround via Powershell and Task Scheduler. The problem I am facing is that while I can access the evtx and save it as a .txt, I am reparsing the entire log every time. However, I would like to only send the new events every 5 minutes or less.
The code I am using is this:
$File = "C:\text.txt; Get-WinEvent -Path C:\test.evtx | Format-Table -AutoSize | Out-File $File -append -width 750
I really appreciate the help!
You could use Get-EventLog rather than Get-WinEvent, then use the -After parameter to only get the last five minutes of events, or better still, keep track of the most recent event you have seen.
Here's how to get the last five minutes of the Application log.
Get-EventLog -LogName Application -After $((Get-Date).AddMinutes(-5))
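And here is a minimal sketch of the "keep track of the most recent event" idea; the state-file and output paths are assumptions:
# Assumed location for the file that remembers the last exported event time.
$stateFile = 'C:\last-event-time.txt'
$File = 'C:\text.txt'

$since =
    if (Test-Path $stateFile) { [DateTime](Get-Content $stateFile) } else { (Get-Date).AddMinutes(-5) }

# Only export events newer than what was exported last time.
$events = Get-EventLog -LogName Application -After $since
$events | Format-Table -AutoSize | Out-File $File -Append -Width 750

# Remember the newest timestamp for the next run.
if ($events) {
    ($events | Sort-Object TimeGenerated | Select-Object -Last 1).TimeGenerated.ToString('o') |
        Set-Content $stateFile
}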

copy / create multiple files

I need to first create and then copy some hundreds of folders and files via PowerShell (first create them on a local store and then copy them to a remote store).
However, when my foreach loop runs, roughly every 40th write attempt fails because "another process" is blocking the file/folder.
I currently work around the issue with a simple sleep (100 ms) between every file creation. However, I wonder whether there is a better way to do this? Especially when copying multiple files, the required sleep would depend on the network latency, so it doesn't seem like a good solution to me.
Is there a way to "wait" till the write-operation of a file completed before starting another operation? Or to check if a file is still blocked by one process and wait till it's free again?
Have you tried running your code as a job? Example:
foreach ($file in $files) {
    $job = Start-Job -ScriptBlock {
        # operation here...
    } | Wait-Job
    # Log the result of the job, e.g. '$job | Receive-Job' to get the output
}
You could also extend it to create multiple jobs, and then use Get-Job | Wait-Job to wait for them all to finish before you proceed.
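A minimal sketch of that multi-job variant; the copy operation and destination share inside the script block are just placeholders:
# Start one job per file; the file path is passed in via -ArgumentList.
foreach ($file in $files) {
    Start-Job -ArgumentList $file.FullName -ScriptBlock {
        param($path)
        # Placeholder operation: copy the file to an assumed destination share.
        Copy-Item -Path $path -Destination '\\server\share\dest'
    } | Out-Null
}

# Wait for all jobs to finish, then collect their output and clean up.
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job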